Feb 26 19:54:16 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 26 19:54:16 crc restorecon[4684]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 19:54:16 crc restorecon[4684]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc 
restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc 
restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 
19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 
crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 
19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc 
restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc 
restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 19:54:16 crc restorecon[4684]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:16 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 19:54:17 crc restorecon[4684]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc 
restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc 
restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 19:54:17 crc restorecon[4684]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 26 19:54:17 crc kubenswrapper[4722]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.892263 4722 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904435 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904469 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904480 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904489 4722 feature_gate.go:330] unrecognized feature gate: Example Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904498 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904510 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904522 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904534 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904544 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 19:54:17 crc kubenswrapper[4722]: 
W0226 19:54:17.904555 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904565 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904576 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904586 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904601 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904616 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904627 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904638 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904650 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904661 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904672 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904682 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904693 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904703 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904725 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904736 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904746 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904757 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904768 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904777 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904785 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904793 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904801 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904809 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904819 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904827 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904836 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904845 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904856 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904866 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904877 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904888 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904898 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904909 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904919 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904929 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904938 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904947 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904957 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904965 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904980 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904991 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.904999 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905007 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905018 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905030 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905041 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905051 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905060 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905069 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905077 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905086 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905094 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905103 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905111 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905120 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905128 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905167 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905181 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905193 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905202 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.905211 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905374 4722 flags.go:64] FLAG: --address="0.0.0.0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905418 4722 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905439 4722 flags.go:64] FLAG: --anonymous-auth="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905454 4722 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905469 4722 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905482 4722 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905498 4722 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905525 4722 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905539 4722 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905552 4722 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905566 4722 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905579 4722 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905592 4722 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905601 4722 flags.go:64] FLAG: --cgroup-root=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905611 4722 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905621 4722 flags.go:64] FLAG: --client-ca-file=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905631 4722 flags.go:64] FLAG: --cloud-config=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905640 4722 flags.go:64] FLAG: --cloud-provider=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905650 4722 flags.go:64] FLAG: --cluster-dns="[]"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905661 4722 flags.go:64] FLAG: --cluster-domain=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905671 4722 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905681 4722 flags.go:64] FLAG: --config-dir=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905690 4722 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905701 4722 flags.go:64] FLAG: --container-log-max-files="5"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905729 4722 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905744 4722 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905756 4722 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905770 4722 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905782 4722 flags.go:64] FLAG: --contention-profiling="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905795 4722 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905808 4722 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905821 4722 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905833 4722 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905860 4722 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905873 4722 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905885 4722 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905896 4722 flags.go:64] FLAG: --enable-load-reader="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905908 4722 flags.go:64] FLAG: --enable-server="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905920 4722 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905939 4722 flags.go:64] FLAG: --event-burst="100"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905952 4722 flags.go:64] FLAG: --event-qps="50"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905964 4722 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905976 4722 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.905989 4722 flags.go:64] FLAG: --eviction-hard=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906004 4722 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906016 4722 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906030 4722 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906045 4722 flags.go:64] FLAG: --eviction-soft=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906057 4722 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906069 4722 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906082 4722 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906099 4722 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906110 4722 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906122 4722 flags.go:64] FLAG: --fail-swap-on="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906176 4722 flags.go:64] FLAG: --feature-gates=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906194 4722 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906207 4722 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906220 4722 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906233 4722 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906245 4722 flags.go:64] FLAG: --healthz-port="10248"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906258 4722 flags.go:64] FLAG: --help="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906270 4722 flags.go:64] FLAG: --hostname-override=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906283 4722 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906296 4722 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906309 4722 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906321 4722 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906333 4722 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906346 4722 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906357 4722 flags.go:64] FLAG: --image-service-endpoint=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906369 4722 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906382 4722 flags.go:64] FLAG: --kube-api-burst="100"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906394 4722 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906409 4722 flags.go:64] FLAG: --kube-api-qps="50"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906421 4722 flags.go:64] FLAG: --kube-reserved=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906435 4722 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906447 4722 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906460 4722 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906473 4722 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906486 4722 flags.go:64] FLAG: --lock-file=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906502 4722 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906515 4722 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906528 4722 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906548 4722 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906561 4722 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906573 4722 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906585 4722 flags.go:64] FLAG: --logging-format="text"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906596 4722 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906609 4722 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906621 4722 flags.go:64] FLAG: --manifest-url=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906633 4722 flags.go:64] FLAG: --manifest-url-header=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906649 4722 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906661 4722 flags.go:64] FLAG: --max-open-files="1000000"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906676 4722 flags.go:64] FLAG: --max-pods="110"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906688 4722 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906701 4722 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906713 4722 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906725 4722 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906737 4722 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906750 4722 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906763 4722 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906792 4722 flags.go:64] FLAG: --node-status-max-images="50"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906804 4722 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906817 4722 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906829 4722 flags.go:64] FLAG: --pod-cidr=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906842 4722 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906861 4722 flags.go:64] FLAG: --pod-manifest-path=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906873 4722 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906887 4722 flags.go:64] FLAG: --pods-per-core="0"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906900 4722 flags.go:64] FLAG: --port="10250"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906913 4722 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906925 4722 flags.go:64] FLAG: --provider-id=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906937 4722 flags.go:64] FLAG: --qos-reserved=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906949 4722 flags.go:64] FLAG: --read-only-port="10255"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906962 4722 flags.go:64] FLAG: --register-node="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906974 4722 flags.go:64] FLAG: --register-schedulable="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.906988 4722 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907020 4722 flags.go:64] FLAG: --registry-burst="10"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907032 4722 flags.go:64] FLAG: --registry-qps="5"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907045 4722 flags.go:64] FLAG: --reserved-cpus=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907057 4722 flags.go:64] FLAG: --reserved-memory=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907073 4722 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907085 4722 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907097 4722 flags.go:64] FLAG: --rotate-certificates="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907109 4722 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907122 4722 flags.go:64] FLAG: --runonce="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907173 4722 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907189 4722 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907203 4722 flags.go:64] FLAG: --seccomp-default="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907215 4722 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907228 4722 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907248 4722 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907260 4722 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907273 4722 flags.go:64] FLAG: --storage-driver-password="root"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907285 4722 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907298 4722 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907310 4722 flags.go:64] FLAG: --storage-driver-user="root"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907322 4722 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907335 4722 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907347 4722 flags.go:64] FLAG: --system-cgroups=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907359 4722 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907381 4722 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907391 4722 flags.go:64] FLAG: --tls-cert-file=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907400 4722 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907413 4722 flags.go:64] FLAG: --tls-min-version=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907423 4722 flags.go:64] FLAG: --tls-private-key-file=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907433 4722 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907442 4722 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907452 4722 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907462 4722 flags.go:64] FLAG: --v="2"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907475 4722 flags.go:64] FLAG: --version="false"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907488 4722 flags.go:64] FLAG: --vmodule=""
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907501 4722 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.907512 4722 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907765 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907779 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907789 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907798 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907806 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907815 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907823 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907832 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907841 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907853 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907862 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907870 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907878 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907887 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907896 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907904 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907912 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907923 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907932 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907940 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907949 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907957 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907965 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907974 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907983 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.907992 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908000 4722 feature_gate.go:330] unrecognized feature gate: Example
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908008 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908020 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908032 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908045 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908058 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908070 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908083 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908099 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908111 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908123 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908176 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908202 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908215 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908226 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908238 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908258 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908274 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908288 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908299 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908310 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908321 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908331 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908421 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908430 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908439 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908447 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908456 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908464 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908552 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908563 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908571 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908583 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908594 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908605 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908614 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908623 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908631 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908640 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908689 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908699 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908708 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908717 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908726 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.908738 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.908752 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.918584 4722 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.918625 4722 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918743 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918754 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918762 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918768 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918773 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918779 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918784 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918789 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918794 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918799 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918804 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918808 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918813 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918818 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918823 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918828 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918833 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918838 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918843 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918848 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918853 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918858 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918863 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918868 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918873 4722 
feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918878 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918882 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918887 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918892 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918897 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918902 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918907 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918911 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918916 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918922 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918928 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918933 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918939 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918946 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918952 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918957 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918963 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918968 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918973 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918978 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918983 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918988 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918993 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.918998 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919003 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919008 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919014 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919022 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919028 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919033 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919039 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919044 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919051 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919057 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919062 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919067 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919073 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919079 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919084 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919089 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919094 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 
19:54:17.919099 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919104 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919109 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919114 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919119 4722 feature_gate.go:330] unrecognized feature gate: Example Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.919127 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919333 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919350 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919356 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919361 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919367 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919372 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 
26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919379 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919384 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919389 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919395 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919400 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919405 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919410 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919415 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919420 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919424 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919429 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919434 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919439 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919444 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919449 4722 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919453 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919458 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919464 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919469 4722 feature_gate.go:330] unrecognized feature gate: Example Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919474 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919479 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919484 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919488 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919493 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919498 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919504 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919510 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919515 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919520 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919526 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919531 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919535 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919542 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919549 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919554 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919559 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919564 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919570 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919574 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919579 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919584 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919589 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919594 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919599 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919603 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919608 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919613 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919619 4722 feature_gate.go:353] Setting 
GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919626 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919631 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919637 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919642 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919647 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919653 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919659 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919665 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919670 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919675 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919680 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919685 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919690 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919695 4722 
feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919700 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919705 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 19:54:17 crc kubenswrapper[4722]: W0226 19:54:17.919712 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.919719 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.921017 4722 server.go:940] "Client rotation is on, will bootstrap in background" Feb 26 19:54:17 crc kubenswrapper[4722]: E0226 19:54:17.924833 4722 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.929356 4722 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.929559 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.931461 4722 server.go:997] "Starting client certificate rotation" Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.931506 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.931643 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.957497 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.959598 4722 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 19:54:17 crc kubenswrapper[4722]: E0226 19:54:17.960094 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:17 crc kubenswrapper[4722]: I0226 19:54:17.977072 4722 log.go:25] "Validated CRI v1 runtime API" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.014619 4722 log.go:25] "Validated CRI v1 image API" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.016722 4722 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.023129 4722 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-26-19-49-47-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.023206 4722 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.052766 4722 manager.go:217] Machine: {Timestamp:2026-02-26 19:54:18.047556241 +0000 UTC m=+0.584524255 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4d7c2ae8-1227-4493-892d-cf55e117ead1 BootID:9fe5d4dc-8478-4c5a-97be-0b5527bf8c18 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:c7:54:eb Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c7:54:eb Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:41:ee:c4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c7:5c:d3 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:aa:a2:16 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e2:c5:e0 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:2b:ac:71:82:bb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:92:32:8b:49:29:c7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.053219 4722 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.053398 4722 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.055304 4722 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.055622 4722 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.055687 4722 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.056044 4722 topology_manager.go:138] "Creating topology manager with none policy"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.056065 4722 container_manager_linux.go:303] "Creating device plugin manager"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.056920 4722 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.056972 4722 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.057282 4722 state_mem.go:36] "Initialized new in-memory state store"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.057421 4722 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.062350 4722 kubelet.go:418] "Attempting to sync node with API server"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.062385 4722 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.062436 4722 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.062458 4722 kubelet.go:324] "Adding apiserver pod source"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.062476 4722 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.067278 4722 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.067737 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.067848 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.067830 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.067925 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.068249 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.069853 4722 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071379 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071419 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071434 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071448 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071471 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071485 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071499 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071521 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071537 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071550 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071593 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071607 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.071638 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.072355 4722 server.go:1280] "Started kubelet"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.072414 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.075062 4722 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.075127 4722 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.076012 4722 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 26 19:54:18 crc systemd[1]: Started Kubernetes Kubelet.
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.079558 4722 server.go:460] "Adding debug handlers to kubelet server"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.080105 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.080182 4722 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.080617 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.080653 4722 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.080661 4722 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.080703 4722 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.081392 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.081636 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.082774 4722 factory.go:55] Registering systemd factory
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.082820 4722 factory.go:221] Registration of the systemd container factory successfully
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.085248 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.085371 4722 factory.go:153] Registering CRI-O factory
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.085405 4722 factory.go:221] Registration of the crio container factory successfully
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.085513 4722 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.085552 4722 factory.go:103] Registering Raw factory
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.085579 4722 manager.go:1196] Started watching for new ooms in manager
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.086501 4722 manager.go:319] Starting recovery of all containers
Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.088000 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099373 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099439 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099461 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099485 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099506 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099527 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099547 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099568 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099593 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099613 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099632 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099650 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099668 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099691 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099710 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099731 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099751 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099769 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099787 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099807 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099826 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099877 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099896 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099915 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099963 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.099982 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100004 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100025 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100059 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100080 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100099 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100118 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100178 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100198 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100216 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100237 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100255 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100272 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100291 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100310 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100330 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100350 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100368 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100387 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100407 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100426 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100444 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100464 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100483 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100501 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100520 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100538 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100564 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100582 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100604 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100824 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100858 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100881 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100900 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100923 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100945 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100965 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.100984 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101007 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101027 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101045 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101066 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101087 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101179 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101201 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101223 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101242 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101259 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101277 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101297 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101315 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101336 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101354 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101371 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101392 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101411 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101430 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101449 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101467 4722 reconstruct.go:130] "Volume is marked as uncertain and
added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101485 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101505 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101524 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101610 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101629 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.101648 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103372 4722 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103410 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103434 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103455 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103475 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103497 4722 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103516 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103534 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103554 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103574 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103593 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103612 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103632 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103649 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103669 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103693 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103714 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103737 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103842 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103864 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103885 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103906 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103928 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103948 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" 
seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103967 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.103986 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104004 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104022 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104041 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104061 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 
19:54:18.104079 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104097 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104116 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104277 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104299 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104318 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104335 4722 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104357 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104374 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104393 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104414 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104432 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104449 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104469 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104487 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104504 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104523 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104541 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104560 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104578 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104595 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104613 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104631 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104653 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104670 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104724 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104748 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104769 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104787 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104804 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104825 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104886 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104906 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104925 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104943 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104961 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104980 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" 
seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.104998 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105017 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105037 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105055 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105077 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105096 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105115 
4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105159 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105182 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105200 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105219 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105239 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105258 4722 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105276 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105294 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105312 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105330 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105347 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105366 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105383 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105400 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105419 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105437 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105455 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105472 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105490 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105509 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105526 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105545 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105567 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105585 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 
19:54:18.105605 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105624 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105643 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105664 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105682 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105700 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105717 4722 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105736 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105754 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105774 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105791 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105809 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105826 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105844 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105863 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105880 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105898 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105917 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105935 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105953 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105970 4722 reconstruct.go:97] "Volume reconstruction finished" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.105984 4722 reconciler.go:26] "Reconciler: start to sync state" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.120574 4722 manager.go:324] Recovery completed Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.137754 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.139232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.139266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.139279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.140219 4722 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.140261 4722 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.140292 4722 state_mem.go:36] "Initialized new in-memory state store" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.142287 4722 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.144683 4722 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.144720 4722 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.144739 4722 kubelet.go:2335] "Starting kubelet main sync loop" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.144821 4722 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.145994 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.146048 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.161002 4722 policy_none.go:49] "None policy: Start" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.161865 4722 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.161906 4722 state_mem.go:35] "Initializing new in-memory state store" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.181275 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204240 4722 manager.go:334] "Starting 
Device Plugin manager" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204283 4722 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204294 4722 server.go:79] "Starting device plugin registration server" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204733 4722 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204760 4722 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204906 4722 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204975 4722 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.204982 4722 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.210545 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.245797 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.245929 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.247508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 
19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.247543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.247568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.247705 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.248829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.248882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.248896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.249240 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.249303 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.249378 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.249426 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.249306 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250510 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250688 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250752 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.250854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251403 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251524 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251590 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.251828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252568 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.252587 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.253212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.253232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.253240 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.286110 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.304872 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.306723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.306809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: 
I0226 19:54:18.306836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.306901 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.307668 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307747 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307892 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.307984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.308035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.308062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.308085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.308105 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 
19:54:18.409749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409840 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409900 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410042 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.409984 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.410121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.508384 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.509880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.509923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.509935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 
19:54:18.509962 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.510419 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.605709 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.630743 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.645912 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-85c1bc1438c96cc0d4b70bf9f8a96f273f7607e618c196f4b66b1d192517dc5a WatchSource:0}: Error finding container 85c1bc1438c96cc0d4b70bf9f8a96f273f7607e618c196f4b66b1d192517dc5a: Status 404 returned error can't find the container with id 85c1bc1438c96cc0d4b70bf9f8a96f273f7607e618c196f4b66b1d192517dc5a Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.652785 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.656755 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-75c2cf7dd20c08f4b76ac9ef1a3f7a27b543a686fc9bf0786d3869bcef98b1f2 WatchSource:0}: Error finding container 75c2cf7dd20c08f4b76ac9ef1a3f7a27b543a686fc9bf0786d3869bcef98b1f2: Status 404 returned error can't find the container with id 75c2cf7dd20c08f4b76ac9ef1a3f7a27b543a686fc9bf0786d3869bcef98b1f2 Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.683255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: E0226 19:54:18.687414 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.689649 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.696895 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-80c3336a8c8560de9afc2360ac255658e1f600918f1f22a5768b61259083e376 WatchSource:0}: Error finding container 80c3336a8c8560de9afc2360ac255658e1f600918f1f22a5768b61259083e376: Status 404 returned error can't find the container with id 80c3336a8c8560de9afc2360ac255658e1f600918f1f22a5768b61259083e376 Feb 26 19:54:18 crc kubenswrapper[4722]: W0226 19:54:18.706779 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-45b6c50a12bc5e2b43b12efa1ecee87747276422141d5134b2093904fba3bd45 WatchSource:0}: Error finding container 45b6c50a12bc5e2b43b12efa1ecee87747276422141d5134b2093904fba3bd45: Status 404 returned error can't find the container with id 45b6c50a12bc5e2b43b12efa1ecee87747276422141d5134b2093904fba3bd45 Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.910773 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.912071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.912325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.912350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:18 crc kubenswrapper[4722]: I0226 19:54:18.912373 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:18 crc 
kubenswrapper[4722]: E0226 19:54:18.912824 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Feb 26 19:54:19 crc kubenswrapper[4722]: W0226 19:54:19.049188 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.049265 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.073181 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.148000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"85c1bc1438c96cc0d4b70bf9f8a96f273f7607e618c196f4b66b1d192517dc5a"} Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.150734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45b6c50a12bc5e2b43b12efa1ecee87747276422141d5134b2093904fba3bd45"} Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.154380 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"80c3336a8c8560de9afc2360ac255658e1f600918f1f22a5768b61259083e376"} Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.155156 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d90eb358a78997bc3cf0f1fa1c2b73c1a4563dde623c4e18bffd5c418a76684"} Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.156054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"75c2cf7dd20c08f4b76ac9ef1a3f7a27b543a686fc9bf0786d3869bcef98b1f2"} Feb 26 19:54:19 crc kubenswrapper[4722]: W0226 19:54:19.232813 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.232901 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:19 crc kubenswrapper[4722]: W0226 19:54:19.367543 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.367608 4722 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.488366 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Feb 26 19:54:19 crc kubenswrapper[4722]: W0226 19:54:19.554936 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.555082 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.713693 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.718425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.718465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.718479 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:19 crc kubenswrapper[4722]: I0226 19:54:19.718506 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:19 crc kubenswrapper[4722]: E0226 19:54:19.718972 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.073938 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.076948 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 19:54:20 crc kubenswrapper[4722]: E0226 19:54:20.077924 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:20 crc kubenswrapper[4722]: E0226 19:54:20.086758 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.161369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.161408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.161418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.161426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.161448 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163557 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163908 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04" exitCode=0 Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.163952 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.164802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.164831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.164842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.165851 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd" exitCode=0 Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.165919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.166037 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.166939 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167312 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167616 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e" exitCode=0 Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167682 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167798 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 
19:54:20.167852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.167870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.168603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.168625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.168633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.170054 4722 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad" exitCode=0 Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.170120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad"} Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.170290 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.172294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.172359 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:20 crc kubenswrapper[4722]: I0226 19:54:20.172379 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:20 crc kubenswrapper[4722]: W0226 19:54:20.907119 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:20 crc kubenswrapper[4722]: E0226 19:54:20.907216 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.073497 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:21 crc kubenswrapper[4722]: E0226 19:54:21.089316 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5a9f752b4b78d0a8f847fc51a4a41749cbb200fff7595e1ddb9f8d580d432b31"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174535 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174540 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.174585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.175310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.175360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.175374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.175979 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e" exitCode=0 Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 
19:54:21.176044 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.176049 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.176651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.176687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.176699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.177527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.177553 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.178097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.178121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.178420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 
19:54:21.183994 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.184513 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.184885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.184926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.184938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711"} Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.185419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.185449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.185460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.185580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 
19:54:21.185634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.185653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: W0226 19:54:21.282127 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.9:6443: connect: connection refused Feb 26 19:54:21 crc kubenswrapper[4722]: E0226 19:54:21.282210 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.9:6443: connect: connection refused" logger="UnhandledError" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.319773 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.321129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.321170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.321180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.321200 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:21 crc kubenswrapper[4722]: E0226 19:54:21.321941 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.9:6443: connect: connection refused" node="crc" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.448292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.521990 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.522293 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.522338 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 26 19:54:21 crc kubenswrapper[4722]: I0226 19:54:21.765565 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190162 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410" exitCode=0 Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190272 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190310 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 
19:54:22.190549 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190593 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410"} Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.190512 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192614 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.192678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.193618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 
19:54:22.193659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.193680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.193845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.193881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.193900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.309859 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.310094 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.311820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.311858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.311873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:22 crc kubenswrapper[4722]: I0226 19:54:22.317921 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.186430 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.195893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9073a1c88735e9e00c2332d6615d61dfa4794cb89be27db10df29ccf0614dc41"} Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.195934 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45d23a64bcdecb7b3c3af4e5b3b6ebbeeabde099fcbc9ffe6c844913e53b3889"} Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.195952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"41c4b07a88f55918dbcd7136aaf157af63386ad3c03605a48bf45c27d8defb79"} Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.195968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"03310a3fe7e38b4a89ded37ad392faa9e07f5cf7a261d5cb34625013d4856608"} Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.195991 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.196058 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.196978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.197009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:23 crc 
kubenswrapper[4722]: I0226 19:54:23.197021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.197065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.197083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:23 crc kubenswrapper[4722]: I0226 19:54:23.197094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.034831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.209526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"06ba4ffc96221354be83ab1d9dc2e9f7d362d6cdc22315d0f8d880f063131d6b"} Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.209636 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.209703 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.211488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.273244 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.273507 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.275078 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.275164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.275183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.335000 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.522913 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.524097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.524131 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.524164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:24 crc kubenswrapper[4722]: I0226 19:54:24.524191 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.211956 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.211973 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:25 crc kubenswrapper[4722]: I0226 19:54:25.213878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:26 crc kubenswrapper[4722]: I0226 19:54:26.712000 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:26 crc 
kubenswrapper[4722]: I0226 19:54:26.712268 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:26 crc kubenswrapper[4722]: I0226 19:54:26.713808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:26 crc kubenswrapper[4722]: I0226 19:54:26.713864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:26 crc kubenswrapper[4722]: I0226 19:54:26.713889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:27 crc kubenswrapper[4722]: I0226 19:54:27.035103 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 19:54:27 crc kubenswrapper[4722]: I0226 19:54:27.035209 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 19:54:28 crc kubenswrapper[4722]: E0226 19:54:28.210841 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.621429 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.621625 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.622593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.622625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.622634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:28 crc kubenswrapper[4722]: I0226 19:54:28.938467 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 26 19:54:29 crc kubenswrapper[4722]: I0226 19:54:29.222270 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:29 crc kubenswrapper[4722]: I0226 19:54:29.223338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:29 crc kubenswrapper[4722]: I0226 19:54:29.223363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:29 crc kubenswrapper[4722]: I0226 19:54:29.223375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.449467 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.449541 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.642958 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z Feb 26 19:54:31 crc kubenswrapper[4722]: W0226 19:54:31.647974 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.648063 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.649783 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:54:31 crc kubenswrapper[4722]: W0226 19:54:31.651809 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.651890 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.653400 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 19:54:31 crc kubenswrapper[4722]: W0226 19:54:31.656892 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.656985 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:31 crc kubenswrapper[4722]: W0226 19:54:31.658287 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.658332 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.661703 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.661750 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.667102 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:54:31 crc kubenswrapper[4722]: I0226 19:54:31.667192 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.668468 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" interval="6.4s" Feb 26 19:54:31 crc kubenswrapper[4722]: E0226 19:54:31.669695 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.075038 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:32Z is after 2026-02-23T05:33:13Z Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.230039 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.232019 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5a9f752b4b78d0a8f847fc51a4a41749cbb200fff7595e1ddb9f8d580d432b31" exitCode=255 Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.232065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5a9f752b4b78d0a8f847fc51a4a41749cbb200fff7595e1ddb9f8d580d432b31"} Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.232235 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.233263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.233302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:32 
crc kubenswrapper[4722]: I0226 19:54:32.233311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:32 crc kubenswrapper[4722]: I0226 19:54:32.233794 4722 scope.go:117] "RemoveContainer" containerID="5a9f752b4b78d0a8f847fc51a4a41749cbb200fff7595e1ddb9f8d580d432b31" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.075957 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:33Z is after 2026-02-23T05:33:13Z Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.190727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.190868 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.192185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.192221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.192230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.236017 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.236816 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.238858 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" exitCode=255 Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.238916 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565"} Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.238986 4722 scope.go:117] "RemoveContainer" containerID="5a9f752b4b78d0a8f847fc51a4a41749cbb200fff7595e1ddb9f8d580d432b31" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.239231 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.240500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.240554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.240575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:33 crc kubenswrapper[4722]: I0226 19:54:33.241735 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:33 crc kubenswrapper[4722]: E0226 19:54:33.242043 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:34 crc kubenswrapper[4722]: I0226 19:54:34.075772 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:34Z is after 2026-02-23T05:33:13Z Feb 26 19:54:34 crc kubenswrapper[4722]: I0226 19:54:34.243912 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 19:54:35 crc kubenswrapper[4722]: I0226 19:54:35.078046 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:35Z is after 2026-02-23T05:33:13Z Feb 26 19:54:35 crc kubenswrapper[4722]: W0226 19:54:35.951379 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:35Z is after 2026-02-23T05:33:13Z Feb 26 19:54:35 crc kubenswrapper[4722]: E0226 19:54:35.951449 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.076361 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:36Z is after 2026-02-23T05:33:13Z Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.527390 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.527557 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.529024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.529065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.529077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.529624 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:36 crc kubenswrapper[4722]: E0226 19:54:36.529817 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:36 crc kubenswrapper[4722]: I0226 19:54:36.531471 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.035228 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.035332 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 19:54:37 crc kubenswrapper[4722]: W0226 19:54:37.067421 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:37Z is after 2026-02-23T05:33:13Z Feb 26 19:54:37 crc kubenswrapper[4722]: E0226 19:54:37.067541 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-02-26T19:54:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.075821 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:37Z is after 2026-02-23T05:33:13Z Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.252677 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.253852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.253890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.253900 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:37 crc kubenswrapper[4722]: I0226 19:54:37.254474 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:37 crc kubenswrapper[4722]: E0226 19:54:37.254640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.053780 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:38 crc 
kubenswrapper[4722]: I0226 19:54:38.054850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.054881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.054890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.054908 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:38 crc kubenswrapper[4722]: E0226 19:54:38.059671 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:38Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 19:54:38 crc kubenswrapper[4722]: E0226 19:54:38.072305 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:38Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.078026 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:38Z is after 2026-02-23T05:33:13Z Feb 26 19:54:38 crc kubenswrapper[4722]: E0226 19:54:38.211495 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not 
found" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.655914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.656176 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.658271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.658328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.658340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:38 crc kubenswrapper[4722]: I0226 19:54:38.674368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 26 19:54:39 crc kubenswrapper[4722]: I0226 19:54:39.077666 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:39Z is after 2026-02-23T05:33:13Z Feb 26 19:54:39 crc kubenswrapper[4722]: I0226 19:54:39.258473 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:39 crc kubenswrapper[4722]: I0226 19:54:39.259964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:39 crc kubenswrapper[4722]: I0226 19:54:39.260054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:39 crc kubenswrapper[4722]: I0226 19:54:39.260076 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.077876 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:40Z is after 2026-02-23T05:33:13Z Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.187996 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.188755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.189050 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.190606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.190666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.190691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:40 crc kubenswrapper[4722]: I0226 19:54:40.191638 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:40 crc kubenswrapper[4722]: E0226 19:54:40.191926 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:40 crc kubenswrapper[4722]: E0226 19:54:40.192013 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.078334 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:41Z is after 2026-02-23T05:33:13Z Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.449038 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.449230 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.450607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.450631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.450640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
26 19:54:41 crc kubenswrapper[4722]: I0226 19:54:41.451055 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:41 crc kubenswrapper[4722]: E0226 19:54:41.451218 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:41 crc kubenswrapper[4722]: E0226 19:54:41.656083 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:54:42 crc kubenswrapper[4722]: I0226 19:54:42.078728 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:42Z is after 2026-02-23T05:33:13Z Feb 
26 19:54:42 crc kubenswrapper[4722]: W0226 19:54:42.599767 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:42Z is after 2026-02-23T05:33:13Z Feb 26 19:54:42 crc kubenswrapper[4722]: E0226 19:54:42.599947 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:43 crc kubenswrapper[4722]: I0226 19:54:43.078375 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:43Z is after 2026-02-23T05:33:13Z Feb 26 19:54:44 crc kubenswrapper[4722]: I0226 19:54:44.078247 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:44Z is after 2026-02-23T05:33:13Z Feb 26 19:54:44 crc kubenswrapper[4722]: W0226 19:54:44.197512 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T19:54:44Z is after 2026-02-23T05:33:13Z Feb 26 19:54:44 crc kubenswrapper[4722]: E0226 19:54:44.197588 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.060083 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.061430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.061476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.061495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.061528 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:45 crc kubenswrapper[4722]: E0226 19:54:45.065125 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:45Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 19:54:45 crc kubenswrapper[4722]: E0226 19:54:45.076539 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:45Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 19:54:45 crc kubenswrapper[4722]: I0226 19:54:45.076871 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:45Z is after 2026-02-23T05:33:13Z Feb 26 19:54:45 crc kubenswrapper[4722]: W0226 19:54:45.307773 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:45Z is after 2026-02-23T05:33:13Z Feb 26 19:54:45 crc kubenswrapper[4722]: E0226 19:54:45.307870 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:46 crc kubenswrapper[4722]: I0226 19:54:46.076186 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z Feb 26 19:54:46 crc kubenswrapper[4722]: W0226 19:54:46.427531 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 2026-02-23T05:33:13Z Feb 26 19:54:46 crc kubenswrapper[4722]: E0226 19:54:46.427632 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.036211 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.036275 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.036328 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.036484 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.037473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.037510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.037520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.038033 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.038202 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a" gracePeriod=30 Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.075738 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:47Z is after 2026-02-23T05:33:13Z Feb 26 19:54:47 crc 
kubenswrapper[4722]: I0226 19:54:47.281489 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.281910 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a" exitCode=255 Feb 26 19:54:47 crc kubenswrapper[4722]: I0226 19:54:47.281947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a"} Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.080002 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:48Z is after 2026-02-23T05:33:13Z Feb 26 19:54:48 crc kubenswrapper[4722]: E0226 19:54:48.211797 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.288203 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.289123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c"} Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.289324 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.291254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.291337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:48 crc kubenswrapper[4722]: I0226 19:54:48.291358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:49 crc kubenswrapper[4722]: I0226 19:54:49.076411 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:49Z is after 2026-02-23T05:33:13Z Feb 26 19:54:49 crc kubenswrapper[4722]: I0226 19:54:49.292230 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:49 crc kubenswrapper[4722]: I0226 19:54:49.293497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:49 crc kubenswrapper[4722]: I0226 19:54:49.293637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:49 crc kubenswrapper[4722]: I0226 19:54:49.293747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:50 crc kubenswrapper[4722]: I0226 19:54:50.078829 4722 csi_plugin.go:884] 
Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:50Z is after 2026-02-23T05:33:13Z Feb 26 19:54:51 crc kubenswrapper[4722]: I0226 19:54:51.078308 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:51Z is after 2026-02-23T05:33:13Z Feb 26 19:54:51 crc kubenswrapper[4722]: E0226 19:54:51.659499 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.066045 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.067406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.067453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.067471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.067503 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:52 crc kubenswrapper[4722]: E0226 19:54:52.070978 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:52Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.076295 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:52Z is after 2026-02-23T05:33:13Z Feb 26 19:54:52 crc kubenswrapper[4722]: E0226 19:54:52.080452 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:52Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.145724 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.147103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.147152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.147162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:52 crc kubenswrapper[4722]: I0226 19:54:52.147612 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.076327 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:53Z is after 2026-02-23T05:33:13Z Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.306337 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.307558 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.309117 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" exitCode=255 Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.309169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57"} Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.309203 4722 scope.go:117] "RemoveContainer" containerID="8bc474d93b21ce189f0098d126a7a7ebe84292f291fbcbead88fd70885535565" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.309389 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.310714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.310748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.310759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:53 crc kubenswrapper[4722]: I0226 19:54:53.311312 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:54:53 crc kubenswrapper[4722]: E0226 19:54:53.311487 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.034867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.035155 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.036333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.036398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.036416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.076536 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:54Z is after 2026-02-23T05:33:13Z Feb 26 19:54:54 crc kubenswrapper[4722]: I0226 19:54:54.315304 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:54:55 crc kubenswrapper[4722]: I0226 19:54:55.078252 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:55Z is after 2026-02-23T05:33:13Z Feb 26 19:54:56 crc kubenswrapper[4722]: I0226 19:54:56.076096 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:56Z is after 2026-02-23T05:33:13Z Feb 26 19:54:56 crc 
kubenswrapper[4722]: I0226 19:54:56.712887 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:54:56 crc kubenswrapper[4722]: I0226 19:54:56.713410 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:56 crc kubenswrapper[4722]: I0226 19:54:56.715021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:56 crc kubenswrapper[4722]: I0226 19:54:56.715093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:56 crc kubenswrapper[4722]: I0226 19:54:56.715112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:57 crc kubenswrapper[4722]: I0226 19:54:57.035259 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 19:54:57 crc kubenswrapper[4722]: I0226 19:54:57.035387 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 19:54:57 crc kubenswrapper[4722]: I0226 19:54:57.079958 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 26 19:54:57 crc kubenswrapper[4722]: I0226 19:54:57.440095 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 19:54:57 crc kubenswrapper[4722]: I0226 19:54:57.459994 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 19:54:58 crc kubenswrapper[4722]: I0226 19:54:58.079813 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:54:58 crc kubenswrapper[4722]: E0226 19:54:58.212581 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:54:59 crc kubenswrapper[4722]: I0226 19:54:59.071385 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:54:59 crc kubenswrapper[4722]: I0226 19:54:59.073223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:54:59 crc kubenswrapper[4722]: I0226 19:54:59.073291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:54:59 crc kubenswrapper[4722]: I0226 19:54:59.073310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:54:59 crc kubenswrapper[4722]: I0226 19:54:59.073346 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:54:59 crc kubenswrapper[4722]: E0226 19:54:59.078674 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:54:59 
crc kubenswrapper[4722]: I0226 19:54:59.078721 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:54:59 crc kubenswrapper[4722]: E0226 19:54:59.083164 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:54:59 crc kubenswrapper[4722]: W0226 19:54:59.291623 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 26 19:54:59 crc kubenswrapper[4722]: E0226 19:54:59.291733 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.079938 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.189246 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.189460 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:00 crc 
kubenswrapper[4722]: I0226 19:55:00.190961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.191036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.191066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:00 crc kubenswrapper[4722]: I0226 19:55:00.191855 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:00 crc kubenswrapper[4722]: E0226 19:55:00.192128 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:00 crc kubenswrapper[4722]: W0226 19:55:00.412515 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 26 19:55:00 crc kubenswrapper[4722]: E0226 19:55:00.412602 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.082889 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:01 crc kubenswrapper[4722]: W0226 19:55:01.083806 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.083873 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.449258 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.449456 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.450872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.450936 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.450959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:01 crc kubenswrapper[4722]: I0226 19:55:01.451819 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.452102 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.668627 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fde933e5d3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,LastTimestamp:2026-02-26 19:54:18.072303059 +0000 UTC m=+0.609271013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.677080 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.685178 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.692474 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.699610 4722 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fdf14e9292 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.208268946 +0000 UTC m=+0.745236860,LastTimestamp:2026-02-26 19:54:18.208268946 +0000 UTC m=+0.745236860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.707404 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.24752925 +0000 UTC m=+0.784497174,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.714642 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.24754932 +0000 UTC m=+0.784517244,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.722168 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.2475728 +0000 UTC m=+0.784540724,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.729483 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.24886125 +0000 UTC m=+0.785829184,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.739099 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.24889046 +0000 UTC m=+0.785858394,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.747228 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.24890288 +0000 UTC m=+0.785870814,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.755272 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.250363902 +0000 UTC m=+0.787331826,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.759299 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC 
m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.250395622 +0000 UTC m=+0.787363546,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.763532 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.250404533 +0000 UTC m=+0.787372447,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.770052 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.250827976 +0000 UTC m=+0.787795910,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.777313 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.250850116 +0000 UTC m=+0.787818050,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.784170 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.250860736 +0000 UTC m=+0.787828680,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.790855 4722 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.251296099 +0000 UTC m=+0.788264023,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.798758 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.251305659 +0000 UTC m=+0.788273583,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.805174 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.251313119 +0000 UTC m=+0.788281043,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.812303 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.251476271 +0000 UTC m=+0.788444235,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.819011 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.251504801 +0000 UTC m=+0.788472765,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.826320 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31f683\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31f683 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139285123 +0000 UTC m=+0.676253057,LastTimestamp:2026-02-26 19:54:18.251523381 +0000 UTC m=+0.788491345,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.833782 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded317f9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded317f9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139254682 +0000 UTC m=+0.676222616,LastTimestamp:2026-02-26 19:54:18.251808753 +0000 UTC m=+0.788776687,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.842538 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897e3fded31cb77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897e3fded31cb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.139274103 +0000 UTC m=+0.676242037,LastTimestamp:2026-02-26 19:54:18.251821903 +0000 UTC m=+0.788789837,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.851424 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe0ba37405 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.650039301 +0000 UTC m=+1.187007245,LastTimestamp:2026-02-26 19:54:18.650039301 +0000 UTC m=+1.187007245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.858976 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe0c422b52 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.660440914 +0000 UTC m=+1.197408848,LastTimestamp:2026-02-26 19:54:18.660440914 +0000 UTC m=+1.197408848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.864057 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe0cdade57 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.670448215 +0000 UTC m=+1.207416139,LastTimestamp:2026-02-26 19:54:18.670448215 +0000 UTC m=+1.207416139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.870781 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe0eac2504 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.700940548 +0000 UTC m=+1.237908492,LastTimestamp:2026-02-26 19:54:18.700940548 +0000 UTC m=+1.237908492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.875175 4722 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe0f504c3d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:18.711698493 +0000 UTC m=+1.248666407,LastTimestamp:2026-02-26 19:54:18.711698493 +0000 UTC m=+1.248666407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.879422 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe2aef8521 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.175118113 +0000 UTC m=+1.712086037,LastTimestamp:2026-02-26 19:54:19.175118113 +0000 UTC m=+1.712086037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.884103 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe2aef891d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.175119133 +0000 UTC m=+1.712087047,LastTimestamp:2026-02-26 19:54:19.175119133 +0000 UTC m=+1.712087047,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.890483 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe2af03363 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.175162723 +0000 UTC 
m=+1.712130647,LastTimestamp:2026-02-26 19:54:19.175162723 +0000 UTC m=+1.712130647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.895238 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe2af0be37 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.175198263 +0000 UTC m=+1.712166187,LastTimestamp:2026-02-26 19:54:19.175198263 +0000 UTC m=+1.712166187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.899501 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe2af47ad3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.175443155 +0000 UTC m=+1.712411099,LastTimestamp:2026-02-26 19:54:19.175443155 +0000 UTC m=+1.712411099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.903969 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe2b6717f5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.182954485 +0000 UTC m=+1.719922409,LastTimestamp:2026-02-26 19:54:19.182954485 +0000 UTC m=+1.719922409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.908933 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe2b791300 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.184132864 +0000 UTC m=+1.721100788,LastTimestamp:2026-02-26 19:54:19.184132864 +0000 UTC m=+1.721100788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.914734 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe2b88f598 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.185173912 +0000 UTC m=+1.722141836,LastTimestamp:2026-02-26 19:54:19.185173912 +0000 UTC m=+1.722141836,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.921254 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe2bb25958 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.187886424 +0000 UTC m=+1.724854348,LastTimestamp:2026-02-26 19:54:19.187886424 +0000 UTC m=+1.724854348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.925525 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe2c0002f3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.192976115 +0000 UTC m=+1.729944039,LastTimestamp:2026-02-26 19:54:19.192976115 +0000 UTC m=+1.729944039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.930577 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe2c003845 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.192989765 +0000 UTC m=+1.729957689,LastTimestamp:2026-02-26 19:54:19.192989765 +0000 UTC m=+1.729957689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.935050 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3dcbb55c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.491538268 +0000 UTC m=+2.028506192,LastTimestamp:2026-02-26 19:54:19.491538268 +0000 UTC m=+2.028506192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.941403 4722 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3e9c118d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.505193357 +0000 UTC m=+2.042161281,LastTimestamp:2026-02-26 19:54:19.505193357 +0000 UTC m=+2.042161281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.947338 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3eaccf99 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
19:54:19.506290585 +0000 UTC m=+2.043258540,LastTimestamp:2026-02-26 19:54:19.506290585 +0000 UTC m=+2.043258540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.951548 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe49a33702 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.690211074 +0000 UTC m=+2.227178998,LastTimestamp:2026-02-26 19:54:19.690211074 +0000 UTC m=+2.227178998,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.955690 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe4a4572a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.700843168 +0000 UTC m=+2.237811102,LastTimestamp:2026-02-26 19:54:19.700843168 +0000 UTC m=+2.237811102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.960110 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe4a5404b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.701798066 +0000 UTC m=+2.238765990,LastTimestamp:2026-02-26 19:54:19.701798066 +0000 UTC m=+2.238765990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.966580 4722 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe55263297 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.883344535 +0000 UTC m=+2.420312459,LastTimestamp:2026-02-26 19:54:19.883344535 +0000 UTC m=+2.420312459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.971172 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe561b53dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.899409373 +0000 UTC m=+2.436377347,LastTimestamp:2026-02-26 19:54:19.899409373 +0000 UTC 
m=+2.436377347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.976962 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe660b1d13 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.166782227 +0000 UTC m=+2.703750171,LastTimestamp:2026-02-26 19:54:20.166782227 +0000 UTC m=+2.703750171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.983975 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe6632ce36 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.169383478 +0000 UTC m=+2.706351412,LastTimestamp:2026-02-26 19:54:20.169383478 +0000 UTC m=+2.706351412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.987893 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe66407c8b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.170280075 +0000 UTC m=+2.707248039,LastTimestamp:2026-02-26 19:54:20.170280075 +0000 UTC m=+2.707248039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.992943 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe66957073 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.175847539 +0000 UTC m=+2.712815483,LastTimestamp:2026-02-26 19:54:20.175847539 +0000 UTC m=+2.712815483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:01 crc kubenswrapper[4722]: E0226 19:55:01.997373 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe71d965a7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.364850599 +0000 UTC m=+2.901818523,LastTimestamp:2026-02-26 19:54:20.364850599 +0000 UTC m=+2.901818523,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: 
E0226 19:55:02.001284 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe720f298b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.368374155 +0000 UTC m=+2.905342079,LastTimestamp:2026-02-26 19:54:20.368374155 +0000 UTC m=+2.905342079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.006699 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe7220a039 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.369518649 +0000 UTC m=+2.906486573,LastTimestamp:2026-02-26 19:54:20.369518649 +0000 UTC m=+2.906486573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 
19:55:02.012019 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe7236348f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.370932879 +0000 UTC m=+2.907900803,LastTimestamp:2026-02-26 19:54:20.370932879 +0000 UTC m=+2.907900803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.017377 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897e3fe72688da0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.37423248 +0000 UTC m=+2.911200404,LastTimestamp:2026-02-26 19:54:20.37423248 +0000 UTC m=+2.911200404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.023180 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe72c97543 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.380583235 +0000 UTC m=+2.917551159,LastTimestamp:2026-02-26 19:54:20.380583235 +0000 UTC m=+2.917551159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.028727 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe72d570ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.381368492 +0000 UTC m=+2.918336416,LastTimestamp:2026-02-26 19:54:20.381368492 +0000 UTC m=+2.918336416,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.034588 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe72e8dd3d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.382641469 +0000 UTC m=+2.919609393,LastTimestamp:2026-02-26 19:54:20.382641469 +0000 UTC m=+2.919609393,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.040120 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe72f18e49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.383211081 +0000 UTC m=+2.920179005,LastTimestamp:2026-02-26 19:54:20.383211081 +0000 UTC m=+2.920179005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.043933 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fe75bbb393 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.430013331 +0000 UTC m=+2.966981255,LastTimestamp:2026-02-26 19:54:20.430013331 +0000 UTC m=+2.966981255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.047783 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe7e4eb165 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.573864293 +0000 UTC m=+3.110832217,LastTimestamp:2026-02-26 19:54:20.573864293 +0000 UTC m=+3.110832217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.051423 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe7e695b49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.575611721 +0000 UTC m=+3.112579645,LastTimestamp:2026-02-26 19:54:20.575611721 +0000 UTC m=+3.112579645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.055762 4722 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe7ee42c1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.583660573 +0000 UTC m=+3.120628497,LastTimestamp:2026-02-26 19:54:20.583660573 +0000 UTC m=+3.120628497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.059513 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe7efc62cc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.585247436 +0000 UTC m=+3.122215360,LastTimestamp:2026-02-26 19:54:20.585247436 +0000 UTC m=+3.122215360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.063453 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe7f574d29 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.591205673 +0000 UTC m=+3.128173597,LastTimestamp:2026-02-26 19:54:20.591205673 +0000 UTC m=+3.128173597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.065394 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe7f77ca7d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.593334909 +0000 UTC m=+3.130302833,LastTimestamp:2026-02-26 19:54:20.593334909 +0000 UTC m=+3.130302833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.067724 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe891036a0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.754319008 +0000 UTC m=+3.291286972,LastTimestamp:2026-02-26 19:54:20.754319008 +0000 UTC m=+3.291286972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.068904 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe89359e2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.75677035 +0000 UTC m=+3.293738274,LastTimestamp:2026-02-26 19:54:20.75677035 +0000 UTC m=+3.293738274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.071401 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897e3fe89f091b8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.769022392 +0000 UTC m=+3.305990316,LastTimestamp:2026-02-26 19:54:20.769022392 +0000 UTC m=+3.305990316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: I0226 19:55:02.074959 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.075067 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe8a334b5f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.773395295 +0000 UTC m=+3.310363219,LastTimestamp:2026-02-26 19:54:20.773395295 +0000 UTC m=+3.310363219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.078350 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe8a450f7a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.77455961 +0000 UTC m=+3.311527534,LastTimestamp:2026-02-26 19:54:20.77455961 +0000 UTC m=+3.311527534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.081426 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe951626ec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.956034796 +0000 UTC m=+3.493002720,LastTimestamp:2026-02-26 19:54:20.956034796 +0000 UTC m=+3.493002720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.084445 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe95c3da70 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.96741848 +0000 UTC m=+3.504386404,LastTimestamp:2026-02-26 19:54:20.96741848 +0000 UTC m=+3.504386404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.087490 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fe95d1ba63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:20.968327779 +0000 UTC m=+3.505295703,LastTimestamp:2026-02-26 19:54:20.968327779 +0000 UTC m=+3.505295703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.091322 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fea0431e0c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.14353102 +0000 UTC m=+3.680498944,LastTimestamp:2026-02-26 19:54:21.14353102 +0000 UTC m=+3.680498944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.094736 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3fea0f76d8d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.155347853 +0000 UTC m=+3.692315777,LastTimestamp:2026-02-26 19:54:21.155347853 +0000 UTC m=+3.692315777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.098420 4722 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fea250a009 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.177970697 +0000 UTC m=+3.714938641,LastTimestamp:2026-02-26 19:54:21.177970697 +0000 UTC m=+3.714938641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.102342 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feae8c5f50 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.38321288 +0000 UTC m=+3.920180804,LastTimestamp:2026-02-26 19:54:21.38321288 +0000 UTC m=+3.920180804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 
crc kubenswrapper[4722]: E0226 19:55:02.105264 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feaf3a7918 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.394622744 +0000 UTC m=+3.931590668,LastTimestamp:2026-02-26 19:54:21.394622744 +0000 UTC m=+3.931590668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.109014 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e3feb6d7159e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": dial tcp 192.168.126.11:6443: connect: connection refused Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.522326942 +0000 UTC 
m=+4.059294866,LastTimestamp:2026-02-26 19:54:21.522326942 +0000 UTC m=+4.059294866,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.115272 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3feb6d79ec4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.522362052 +0000 UTC m=+4.059329976,LastTimestamp:2026-02-26 19:54:21.522362052 +0000 UTC m=+4.059329976,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.119915 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fedeeaa8b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.194698422 +0000 UTC m=+4.731666376,LastTimestamp:2026-02-26 19:54:22.194698422 +0000 UTC m=+4.731666376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.124970 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feeb24f8a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.399846564 +0000 UTC m=+4.936814498,LastTimestamp:2026-02-26 19:54:22.399846564 +0000 UTC m=+4.936814498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.128868 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feebc2cb7f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.410189695 +0000 UTC m=+4.947157629,LastTimestamp:2026-02-26 19:54:22.410189695 +0000 UTC m=+4.947157629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.132154 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feebd96502 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.411670786 +0000 UTC m=+4.948638720,LastTimestamp:2026-02-26 19:54:22.411670786 +0000 UTC m=+4.948638720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.136309 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefadb4b40 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.663453504 +0000 UTC m=+5.200421438,LastTimestamp:2026-02-26 19:54:22.663453504 +0000 UTC m=+5.200421438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.139992 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefb9caa39 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.676126265 +0000 UTC m=+5.213094199,LastTimestamp:2026-02-26 19:54:22.676126265 +0000 UTC m=+5.213094199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.143962 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefba9b0dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.676979933 +0000 UTC m=+5.213947877,LastTimestamp:2026-02-26 19:54:22.676979933 +0000 UTC m=+5.213947877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.147498 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff0746b7d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.871820244 +0000 UTC m=+5.408788208,LastTimestamp:2026-02-26 19:54:22.871820244 +0000 UTC m=+5.408788208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.150726 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff07e34fae 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.882082734 +0000 UTC m=+5.419050698,LastTimestamp:2026-02-26 19:54:22.882082734 +0000 UTC m=+5.419050698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.153950 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff07f1fd9c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.883044764 +0000 UTC m=+5.420012728,LastTimestamp:2026-02-26 19:54:22.883044764 +0000 UTC m=+5.420012728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.157396 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff15dea0c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.116656834 +0000 UTC m=+5.653624748,LastTimestamp:2026-02-26 19:54:23.116656834 +0000 UTC m=+5.653624748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.160752 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff167922ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.12678267 +0000 UTC m=+5.663750594,LastTimestamp:2026-02-26 19:54:23.12678267 +0000 UTC m=+5.663750594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.164241 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff1686acb3 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.127669939 +0000 UTC m=+5.664637873,LastTimestamp:2026-02-26 19:54:23.127669939 +0000 UTC m=+5.664637873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.167918 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff1f7fbbb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.278209974 +0000 UTC m=+5.815177898,LastTimestamp:2026-02-26 19:54:23.278209974 +0000 UTC m=+5.815177898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.171118 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897e3ff2004d163 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.286931811 +0000 UTC m=+5.823899745,LastTimestamp:2026-02-26 19:54:23.286931811 +0000 UTC m=+5.823899745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.175441 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e3ffff6ea36b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:27.035186027 +0000 UTC m=+9.572153961,LastTimestamp:2026-02-26 19:54:27.035186027 +0000 UTC m=+9.572153961,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} 
Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.178749 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3ffff6f5bea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:27.035233258 +0000 UTC m=+9.572201192,LastTimestamp:2026-02-26 19:54:27.035233258 +0000 UTC m=+9.572201192,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.182941 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401068bec03 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get 
"https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.449512963 +0000 UTC m=+13.986480887,LastTimestamp:2026-02-26 19:54:31.449512963 +0000 UTC m=+13.986480887,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.186562 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e401068ccba6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.449570214 +0000 UTC m=+13.986538138,LastTimestamp:2026-02-26 19:54:31.449570214 +0000 UTC m=+13.986538138,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.190273 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc 
kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401133233f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 19:55:02 crc kubenswrapper[4722]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:55:02 crc kubenswrapper[4722]: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC m=+14.198704876,LastTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC m=+14.198704876,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.194091 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e4011332c82d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661774893 +0000 UTC 
m=+14.198742817,LastTimestamp:2026-02-26 19:54:31.661774893 +0000 UTC m=+14.198742817,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.198242 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897e401133233f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401133233f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 19:55:02 crc kubenswrapper[4722]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:55:02 crc kubenswrapper[4722]: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC m=+14.198704876,LastTimestamp:2026-02-26 19:54:31.667169249 +0000 UTC m=+14.204137193,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.203848 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.207431 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.213856 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537c37a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:47.036259024 +0000 UTC m=+29.573226948,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.218066 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537d773b\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:47.036301245 +0000 UTC m=+29.573269159,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.222751 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e404a7b43889 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:47.038187657 +0000 UTC m=+29.575155591,LastTimestamp:2026-02-26 
19:54:47.038187657 +0000 UTC m=+29.575155591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.227490 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe2b88f598\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe2b88f598 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.185173912 +0000 UTC m=+1.722141836,LastTimestamp:2026-02-26 19:54:47.155829339 +0000 UTC m=+29.692797263,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.232720 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe3dcbb55c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3dcbb55c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.491538268 +0000 UTC m=+2.028506192,LastTimestamp:2026-02-26 19:54:47.322861812 +0000 UTC m=+29.859829746,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.237015 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe3e9c118d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3e9c118d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.505193357 +0000 UTC m=+2.042161281,LastTimestamp:2026-02-26 19:54:47.333087529 +0000 UTC m=+29.870055473,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.244501 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537c37a6\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:57.03532638 +0000 UTC m=+39.572294334,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.249511 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537d773b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:57.035425312 +0000 UTC m=+39.572393266,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:03 crc kubenswrapper[4722]: I0226 19:55:03.079685 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:03 crc kubenswrapper[4722]: W0226 19:55:03.135161 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 26 19:55:03 crc kubenswrapper[4722]: E0226 19:55:03.135214 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 19:55:04 crc kubenswrapper[4722]: I0226 19:55:04.079990 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.082024 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" 
at the cluster scope Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.720176 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.720505 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.728486 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.079850 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.080012 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081511 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081547 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:06 crc kubenswrapper[4722]: E0226 19:55:06.088837 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:06 crc kubenswrapper[4722]: E0226 19:55:06.089300 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.351585 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:07 crc kubenswrapper[4722]: I0226 19:55:07.073649 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:08 crc kubenswrapper[4722]: I0226 19:55:08.076848 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:08 crc kubenswrapper[4722]: E0226 19:55:08.212927 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:09 crc kubenswrapper[4722]: I0226 19:55:09.074739 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:10 crc kubenswrapper[4722]: I0226 19:55:10.077724 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:11 crc kubenswrapper[4722]: I0226 19:55:11.078936 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:12 crc kubenswrapper[4722]: I0226 19:55:12.079854 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.080554 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.089795 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091716 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:13 crc kubenswrapper[4722]: E0226 19:55:13.097242 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:13 crc kubenswrapper[4722]: E0226 19:55:13.097303 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.079593 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.145533 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.146971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.147044 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.147063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.148200 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.278499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.278694 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.280972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.281008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.281019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.373624 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.374873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"} Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375019 4722 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.078004 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.379443 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.380037 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382644 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" exitCode=255 Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"} Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382727 4722 
scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382977 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384869 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:15 crc kubenswrapper[4722]: E0226 19:55:15.385070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:16 crc kubenswrapper[4722]: I0226 19:55:16.077534 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:16 crc kubenswrapper[4722]: I0226 19:55:16.387964 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:55:17 
crc kubenswrapper[4722]: I0226 19:55:17.077157 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:18 crc kubenswrapper[4722]: I0226 19:55:18.077675 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:18 crc kubenswrapper[4722]: E0226 19:55:18.213987 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:19 crc kubenswrapper[4722]: I0226 19:55:19.077810 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.079365 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.098442 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.099966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100050 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100095 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.103940 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.103968 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.188669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.188941 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.191524 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.191753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.079612 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.449110 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.449335 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.451315 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:21 crc kubenswrapper[4722]: E0226 19:55:21.451581 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.079850 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.510405 4722 csr.go:261] certificate signing request csr-zjbr8 is approved, waiting to be issued Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.518352 4722 csr.go:257] certificate signing request csr-zjbr8 is issued Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.580569 4722 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.931655 4722 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 26 19:55:23 crc kubenswrapper[4722]: I0226 19:55:23.520186 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-15 06:34:27.817284462 +0000 UTC Feb 26 19:55:23 crc kubenswrapper[4722]: I0226 19:55:23.520239 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6274h39m4.29704919s for next certificate rotation Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.145549 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146704 4722 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.104930 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106081 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106180 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.114193 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.114447 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.114464 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118296 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.129434 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.136923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.136995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.154411 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.163008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.173275 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179118 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.186776 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.186988 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.187025 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.287091 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.388117 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.488528 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.589555 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.690573 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.791305 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.892164 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.992591 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.093308 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.193758 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.214102 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.294383 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.395297 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.496128 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.596976 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.697172 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.797973 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.899238 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.999372 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.100083 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.200501 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.302042 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.402226 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.502780 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.603215 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.704074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.804900 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.905756 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.006004 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.107120 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.207331 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.308499 4722 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.408892 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.509861 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.610604 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.711661 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.812518 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.913460 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.013585 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.114326 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.215157 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.316097 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.416726 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.517037 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.617861 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.718956 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.819960 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.920633 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.021076 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.121453 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.145005 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.147034 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:32 
crc kubenswrapper[4722]: E0226 19:55:32.147243 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.222100 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.322226 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.423221 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.523452 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.623862 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.724920 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.825839 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.926945 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.028074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc 
kubenswrapper[4722]: E0226 19:55:33.128635 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.229250 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.330261 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.430468 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.530706 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.631684 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.732611 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.832941 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.933998 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.034754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.135551 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.236951 4722 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.337074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.437541 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.538685 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.639848 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.740099 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.840228 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.940376 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.041417 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.142164 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.243060 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.344335 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.444494 4722 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.544908 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.645619 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.746754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.847292 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.948469 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.049221 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.149730 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.250409 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.350761 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.452222 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.553362 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 
19:55:36.653907 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.754549 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.855478 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.956646 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.057665 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.158610 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.258704 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.359574 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.460253 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.534215 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538243 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.547035 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.555008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.564499 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.584290 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605555 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605666 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605688 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.706754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.807359 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.907845 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.008493 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.109368 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.209742 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.215045 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.310859 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: 
E0226 19:55:38.411254 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.511975 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.612429 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.713048 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.814246 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.915049 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: I0226 19:55:39.010092 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.015261 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.115831 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.216754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.318180 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.419228 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 
26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.519552 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.620313 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.721481 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.822606 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.923666 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.024377 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.124715 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.145769 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.225240 4722 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.326473 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.427397 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.528420 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.629010 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.730244 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.830732 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.931716 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.032989 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.133839 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.235124 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.335689 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.436380 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.537160 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.638046 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.739325 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.840223 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.940723 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.040910 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.141606 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.241797 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.342526 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.443717 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.545112 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc 
kubenswrapper[4722]: E0226 19:55:42.645967 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.746099 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.846993 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.948033 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.048306 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.149014 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.250435 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.351355 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.453326 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.554466 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: I0226 19:55:43.588664 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.655640 4722 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.756472 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.856923 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.957517 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.058658 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.158882 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.259666 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.360522 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.460988 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.561944 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.662682 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.763514 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.864452 4722 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.965334 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.066193 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.167050 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.267153 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.367894 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.468924 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.569455 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.669927 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.770884 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.871988 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.972219 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 
19:55:46.072515 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.145281 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146661 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.146817 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.173451 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.274567 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.375535 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: 
E0226 19:55:46.475989 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.576270 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.676347 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.776882 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.877391 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.977664 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.078573 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.179271 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.280344 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.380705 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.481393 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.582339 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.612148 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.683413 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.737964 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.745960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.745990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.758948 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.773632 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777721 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.793931 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799972 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809482 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809625 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809659 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.910478 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.011350 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.112197 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.212737 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.215987 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.313675 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.414697 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.514977 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: 
E0226 19:55:48.615588 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.659802 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717260 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.921998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024506 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.118711 4722 apiserver.go:52] "Watching apiserver" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.123349 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.123659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124241 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.124313 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124245 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.124426 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125260 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.126029 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125983 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.126900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.126985 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.128335 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.128699 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.129460 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.129522 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.130025 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.131061 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.147735 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.162741 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.172722 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.181525 4722 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.191031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.201483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.209787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.218317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231160 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") 
" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254466 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254813 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255059 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255230 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255257 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 
19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255490 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255537 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 
19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255718 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255910 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256026 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256090 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 19:55:49 crc 
kubenswrapper[4722]: I0226 19:55:49.256264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256288 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256335 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256406 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256475 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc 
kubenswrapper[4722]: I0226 19:55:49.256530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256624 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256961 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257108 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257131 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257352 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257396 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.257736 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257868 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257877 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258020 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258192 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " 
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258421 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258690 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258730 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 
crc kubenswrapper[4722]: I0226 19:55:49.258842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258934 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258974 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259098 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259174 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259261 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc 
kubenswrapper[4722]: I0226 19:55:49.259331 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259478 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259594 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.260304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260424 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260708 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260745 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260978 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262047 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262109 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262132 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262180 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258279 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258615 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259489 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260451 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261389 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262181 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.267571 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.767549303 +0000 UTC m=+92.304517227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262932 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263058 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264430 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264654 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264747 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265366 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265571 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266987 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267013 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267416 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267930 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268520 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269097 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270332 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270465 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271711 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271998 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271927 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271157 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271129 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271286 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272928 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274204 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274583 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274921 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275719 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275859 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276075 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276612 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.277153 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.277499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.277729 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.277841 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.777812971 +0000 UTC m=+92.314780935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.278668 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.278753 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.778732346 +0000 UTC m=+92.315700350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278766 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278924 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279118 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279166 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279197 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279340 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279099 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.280594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.280634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.281177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.285798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.289351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294642 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.291127 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299486 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.291778 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295728 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295589 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296899 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.297260 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.298597 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.299510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299803 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299955 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300235 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300256 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301271 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301207 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.801179785 +0000 UTC m=+92.338147719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301995 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-26 19:55:49.801970016 +0000 UTC m=+92.338937990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.303012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.303743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304911 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.305290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.306632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308426 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308953 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309023 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309098 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309575 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309771 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309779 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.310194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.310368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.311442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.311837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.313231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.315344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.315912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.318563 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334322 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.335164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.338593 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362892 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362903 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362914 4722 reconciler_common.go:293] "Volume detached for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362934 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362944 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362952 4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362961 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362969 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362979 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362987 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362995 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363004 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363012 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363020 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363030 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363040 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363048 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363057 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363068 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363079 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363088 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363096 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363104 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: 
I0226 19:55:49.363112 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363122 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363148 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363157 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363166 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363174 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363183 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363191 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363199 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363207 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363216 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363231 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363240 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363248 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363258 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363266 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363275 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363282 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363291 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363298 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363306 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363314 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc 
kubenswrapper[4722]: I0226 19:55:49.363322 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363332 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363340 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363348 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363358 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363367 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363378 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363389 4722 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363399 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363407 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363415 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363424 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363432 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363442 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363452 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363503 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363550 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363619 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363665 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363682 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 26 
19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363702 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363713 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363725 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363736 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363746 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363758 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363773 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.363784 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363794 4722 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363803 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363811 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363820 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363829 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363838 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363847 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363857 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363866 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363884 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363894 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363903 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363911 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363929 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363938 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363947 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363955 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363963 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363975 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.363984 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363993 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364001 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364009 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364018 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364027 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364037 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364045 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364054 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364063 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364072 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364081 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364090 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364099 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364108 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") 
on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364116 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364125 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364145 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364154 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364169 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364177 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364186 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364196 4722 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364205 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364214 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364223 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364232 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364240 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364249 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364258 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364266 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364275 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364284 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364302 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364314 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364325 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364336 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364346 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364354 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364363 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364372 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364382 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364391 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364399 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364407 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364416 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364426 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364434 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364443 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364456 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: 
I0226 19:55:49.364466 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364476 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364484 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364492 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364500 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364509 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364518 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.364527 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364545 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364554 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364563 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364571 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364580 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364588 4722 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364597 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364606 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364614 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364622 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364632 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364641 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364649 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" 
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364658 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364666 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364674 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364683 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364693 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364701 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364710 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364722 4722 reconciler_common.go:293] "Volume detached 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364734 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364748 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364762 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364772 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364781 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364791 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364800 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364809 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364817 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364826 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364836 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364845 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364854 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364863 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" 
DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364872 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.442943 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.448473 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.454268 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: W0226 19:55:49.469202 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948 WatchSource:0}: Error finding container a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948: Status 404 returned error can't find the container with id a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948 Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.539797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.768176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.768334 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:55:50.768302771 +0000 UTC m=+93.305270695 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869115 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869160 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869174 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869210 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869229 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.86921201 +0000 UTC m=+93.406179934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869235 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869247 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869252 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869275 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869290 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869278991 +0000 UTC m=+93.406246915 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869439 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869403545 +0000 UTC m=+93.406371519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869463 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869451396 +0000 UTC m=+93.406419350 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948700 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050701 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.149797 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.150662 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.151349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.152181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.152813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153441 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.154567 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.155200 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.155856 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.156444 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 26 
19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.157116 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.157667 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.158237 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.158750 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.160234 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.161884 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.162817 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.164534 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 26 
19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.165260 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.165736 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.166813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.167364 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.168412 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.168836 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.169869 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.170589 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 26 
19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.171467 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.172081 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.172543 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.173497 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.173772 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.175371 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.176316 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.176737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.178181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.181663 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.183222 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.184162 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.185572 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.186168 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.187426 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.188669 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.189429 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.190457 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.191168 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.192236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.193046 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.194206 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.194811 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.195461 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.196554 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.197325 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.198458 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256845 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.469664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"865847439ece32053028128d2318f94da449ca7278343943b0d9df702e35c020"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.471930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.472102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"758406ff8fc1b5ea2c20787c83732ea59e7f925af18fede22e406590bb120ee9"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.485015 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.496716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.508966 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.522895 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.537595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.549186 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563230 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563942 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.575242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.587637 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.600642 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.612353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.622782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665601 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768181 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.777389 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.777533 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.777513177 +0000 UTC m=+95.314481111 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: 
I0226 19:55:50.870656 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878517 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878518 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878568 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878555129 +0000 UTC m=+95.415523053 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878519 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878616 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878631 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878602 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.8785901 +0000 UTC m=+95.415558024 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878674312 +0000 UTC m=+95.415642236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878726 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878736 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878743 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc 
kubenswrapper[4722]: E0226 19:55:50.878777 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878770635 +0000 UTC m=+95.415738559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973188 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075714 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145552 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145693 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145792 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177660 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.279990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382407 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484404 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586338 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688623 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790679 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.893989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894061 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996293 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098957 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406455 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.476795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.490890 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.502446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc 
kubenswrapper[4722]: I0226 19:55:52.508737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508746 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.516765 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 
19:55:52.530906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.542935 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.557007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713786 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.771854 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-glv66"] Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.772181 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.775693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.776062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.776124 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.790417 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.796411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.796590 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.796565156 +0000 UTC m=+99.333533140 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.803614 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816189 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.828925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.842928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.861616 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.876856 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897492 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897543 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897563 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897542277 +0000 UTC m=+99.434510221 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897553 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897615 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897635 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897572 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897670 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897716 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897704081 +0000 UTC m=+99.434672005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897746 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897733662 +0000 UTC m=+99.434701806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897773 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897932 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 19:55:56.897876176 +0000 UTC m=+99.434844130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919363 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.997930 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.997983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.998178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.018012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022496 4722 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022529 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.086783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.106070 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52a6245_586b_400a_9515_e6b76a677070.slice/crio-daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2 WatchSource:0}: Error finding container daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2: Status 404 returned error can't find the container with id daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 
19:55:53.125749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125767 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.145775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.145938 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.146242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.146287 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.146344 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.146402 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.153600 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cgjxc"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.153896 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-p2glm"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154412 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cfwh9"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154953 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.155285 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159286 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159648 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159893 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159977 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161001 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161371 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161361 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161596 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.162258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.163062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.176343 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.200932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.215503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229334 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.236165 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.257197 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.267605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.278621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.293560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.300974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301309 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301451 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301753 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " 
pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.307982 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.322472 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332408 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.338349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.353387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 
2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.373635 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.386337 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.399344 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: 
I0226 19:55:53.402432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402499 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402587 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod 
\"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: 
\"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402799 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: 
\"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403176 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.404024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.410494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.417995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.419036 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.419911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.420502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.431736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.442811 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.452083 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.462974 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.476207 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.480252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glv66" event={"ID":"d52a6245-586b-400a-9515-e6b76a677070","Type":"ContainerStarted","Data":"1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.480312 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glv66" event={"ID":"d52a6245-586b-400a-9515-e6b76a677070","Type":"ContainerStarted","Data":"daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.486331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.486688 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb99326_dd22_4186_84da_ba208f104cd6.slice/crio-2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309 WatchSource:0}: Error finding container 2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309: Status 404 returned error can't find the container with id 2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.493789 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.495153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.496359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d6419f_1ddb_4df3_9da4_00b4b088a818.slice/crio-a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01 WatchSource:0}: Error finding container a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01: Status 404 returned error can't find the container with id a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.508220 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.525058 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537208 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.542228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.543039 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546457 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.548443 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.549653 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.550209 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.551379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.570070 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.583857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.595736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.607017 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.620902 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.644181 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.655004 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.666883 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.677758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.688707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705556 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc 
kubenswrapper[4722]: I0226 19:55:53.705574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 
19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705726 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705758 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705788 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod 
\"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.718586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.736065 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741687 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.747871 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.759100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.770690 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.781360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 
19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806946 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807191 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807285 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") 
pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" 
(UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: 
\"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 
19:55:53.807909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807894 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.808165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.811574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.826930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844579 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.884339 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.895401 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110fea1c_1463_40d7_bb4b_1825d5b706f0.slice/crio-4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe WatchSource:0}: Error finding container 4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe: Status 404 returned error can't find the container with id 4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048823 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150436 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355216 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485389 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" exitCode=0 Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485519 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487366 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d" exitCode=0 Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerStarted","Data":"34fb3b2053be59c656cd3d226c7dadf25248cb4706fe29da6c88e5634c7d3a9f"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.502488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.505801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.546584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560792 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.565930 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.580712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.594825 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.608104 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.618512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.639902 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.650948 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.660647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663582 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.669342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.680232 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.691598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.702332 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.713309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.731620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.746606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.757222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765897 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.771868 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.785796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.796433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.813094 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970391 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145252 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145471 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145080 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145643 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176305 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279211 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.380844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510483 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 19:55:55 crc kubenswrapper[4722]: 
I0226 19:55:55.512150 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3" exitCode=0 Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.512156 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.524080 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.543320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.552333 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.562312 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.580913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.595426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.611260 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.626351 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.650086 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.668060 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.683119 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688567 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790783 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099734 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.201727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407871 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510253 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.519231 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b" exitCode=0 Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.519274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.534995 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.548249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.559891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.570212 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.583735 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.604342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 
19:55:56.613197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613298 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.619070 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.631695 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.642833 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.655043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.678845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.714973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715044 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817817 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.844175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.844270 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:56:04.844246004 +0000 UTC m=+107.381213928 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.944904 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.944985 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.944960497 +0000 UTC m=+107.481928461 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.945025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945033 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945100 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945117 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945104011 +0000 UTC m=+107.482071975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945119 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945042 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945174 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945186 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945154 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945236 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945219974 +0000 UTC m=+107.482187898 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945275 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945249965 +0000 UTC m=+107.482217929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.126977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145358 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145541 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145683 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145882 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435823 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.524072 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be" exitCode=0 Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.524150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537951 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.539127 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.552667 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.564638 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.577102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.596187 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.610411 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 
19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.624004 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.633256 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.651226 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.664228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.674770 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742459 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844758 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947531 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.063542 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.095703 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.100996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101112 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.120475 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.127416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.127695 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.147807 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.167689 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.167832 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.168448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.170111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.170351 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.184678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.196250 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.209550 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.227112 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.242161 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0
a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.255532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.265487 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272740 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.275649 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.284023 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.293502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375704 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477693 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.529855 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7" exitCode=0 Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.529944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.535226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.544253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.556453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.568553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.578457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.589219 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.601037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.611896 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.621815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.631925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.643963 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.660360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785313 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145169 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145301 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145532 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145936 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.296975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297044 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401102 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.451444 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pkptb"] Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.451844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.454382 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.466907 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6
a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.483863 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0
a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dscq2\" (UniqueName: \"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.494653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.509236 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.523481 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.534179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.542530 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2" exitCode=0 Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.542584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.551444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.565246 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.579692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.586999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dscq2\" (UniqueName: 
\"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.589449 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.590726 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.606534 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608470 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.614066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscq2\" (UniqueName: \"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.624717 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.638402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.660340 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\
\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.670528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.681900 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.695056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.708240 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 
19:55:59.711121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711183 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.719349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.732632 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.747858 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.760891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.764388 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.774387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: W0226 19:55:59.774556 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1461db_ac2a_4a8e_af9c_ea1b340c91e7.slice/crio-4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae WatchSource:0}: Error finding container 4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae: Status 404 returned error can't find the container with id 4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.786164 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f8
4d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.813966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc 
kubenswrapper[4722]: I0226 19:55:59.814310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.916956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.916991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917026 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019485 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121906 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434542 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.551315 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerStarted","Data":"9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558842 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558912 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.561961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkptb" event={"ID":"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7","Type":"ContainerStarted","Data":"150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.562287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkptb" event={"ID":"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7","Type":"ContainerStarted","Data":"4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.571947 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.593224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.593760 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.595538 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.626672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640157 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.641721 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.655218 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.667288 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.680364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.691179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.700336 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.710353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.723132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.734800 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 
19:56:00.742192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742206 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742254 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.749184 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.763663 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.783533 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.800893 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88
caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.819734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde
0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fd
d367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.831488 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.842252 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844996 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.856822 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.868073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.879206 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.891034 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.901052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 
19:56:00.947761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947821 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145118 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145287 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145318 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145993 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152527 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.157486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.158277 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255220 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357325 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.563022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.563042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.569118 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.571526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.597231 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.617886 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.639369 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.661391 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 
19:56:01.668858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668904 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.695084 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.706943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b93
2fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.721348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.733876 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.745297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e611
88f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.758647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a852
24437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770081 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 
19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.782158 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.796857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 
19:56:01.872927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.078984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079052 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285637 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387798 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387877 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.575208 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/0.log" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.577414 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" exitCode=1 Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.577463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.578946 4722 scope.go:117] "RemoveContainer" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.580417 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.592512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594097 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.603746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.617710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.631045 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.644910 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.660211 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19
:56:02Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0226 19:56:02.519174 6499 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 19:56:02.518720 6499 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 19:56:02.519206 6499 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519331 6499 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.519570 6499 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519609 6499 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.520120 6499 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 19:56:02.520158 6499 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 19:56:02.520183 6499 factory.go:656] Stopping watch factory\\\\nI0226 19:56:02.520184 6499 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 19:56:02.520201 6499 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26
a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.673734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.686557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d
81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:
55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696416 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.698075 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.709681 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.721309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.731696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.743514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.798965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799058 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901754 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106907 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145935 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145996 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145945 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146234 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209773 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.313007 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414767 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.516998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517071 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.582120 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.582684 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/0.log" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585162 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" exitCode=1 Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585308 4722 scope.go:117] "RemoveContainer" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585952 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.586447 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.602190 4722 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.617049 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.632950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.643552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.656469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.686239 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:02Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0226 19:56:02.519174 6499 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 19:56:02.518720 6499 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 
19:56:02.519206 6499 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519331 6499 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.519570 6499 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519609 6499 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.520120 6499 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 19:56:02.520158 6499 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 19:56:02.520183 6499 factory.go:656] Stopping watch factory\\\\nI0226 19:56:02.520184 6499 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 19:56:02.520201 6499 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-
bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\
\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc 
kubenswrapper[4722]: I0226 19:56:03.702736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d
scq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.717710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.723976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 
19:56:03.724028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724039 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.729502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.741001 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.749129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.762934 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.778797 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827219 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929108 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031935 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135237 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238832 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.548919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549100 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.593584 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.598506 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:04 crc kubenswrapper[4722]: E0226 19:56:04.598925 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.628131 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.649198 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652377 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.678998 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.695790 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.707710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.738875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.751159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.764372 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.776962 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.788867 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.803666 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.817634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.829044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.856981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 
19:56:04.857048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857056 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.939702 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:04 crc kubenswrapper[4722]: E0226 19:56:04.939870 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:56:20.939842256 +0000 UTC m=+123.476810170 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959595 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959607 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041675 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041679 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041699 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041768 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041720 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041706871 +0000 UTC m=+123.578674785 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041816 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041826 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041842 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041814444 +0000 UTC m=+123.578782378 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041862 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041875 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041864935 +0000 UTC m=+123.578832869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041935 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041918657 +0000 UTC m=+123.578886581 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.061968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.061998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062029 4722 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145240 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145273 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145483 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145563 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163834 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265375 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.352128 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d"] Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.352553 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.353872 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.354067 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367267 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.371105 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.382263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.393630 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.406752 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.416446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.431373 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.450357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.463372 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.475803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.487985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.503222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.514504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.524399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.534354 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.546018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.546549 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc 
kubenswrapper[4722]: I0226 19:56:05.546573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.551549 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.561061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 
19:56:05.572289 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.666563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674835 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: W0226 19:56:05.679058 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90724380_7f87_4ab9_955a_71f8c75db52f.slice/crio-6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f WatchSource:0}: Error finding container 6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f: Status 404 returned error can't find the container with id 6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777129 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879445 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981575 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083385 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.083960 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084037 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.103456 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.118074 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.129224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.140269 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151362 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65ww\" (UniqueName: 
\"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.160200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4
f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.173035 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.183249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc 
kubenswrapper[4722]: I0226 19:56:06.186075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186196 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.193431 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.204067 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.214675 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.223803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.233087 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.244245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.251977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65ww\" (UniqueName: \"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.252027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.252120 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.252207 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:06.752191589 +0000 UTC m=+109.289159503 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.266886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65ww\" (UniqueName: \"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.273196 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288882 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494080 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596751 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.623020 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.638960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc 
kubenswrapper[4722]: I0226 19:56:06.651776 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.662385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.671415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.685701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.702502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.713858 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.727378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.745289 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.757778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.757909 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.757965 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:07.757951062 +0000 UTC m=+110.294918986 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.760445 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.778617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.787478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.799043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801968 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.811730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905655 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007689 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110259 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.145999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.146011 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.146019 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146227 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146365 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146555 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212482 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315149 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416925 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622507 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725089 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.768819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.768975 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.769066 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:09.76904528 +0000 UTC m=+112.306013204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827119 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929356 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032755 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.145372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.145493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.163778 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.178024 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.189124 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.199261 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.213607 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.225955 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.244693 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.257583 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc 
kubenswrapper[4722]: I0226 19:56:08.274007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.288694 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.302820 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.314020 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.335265 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339613 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.346195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.356739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410
cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441886 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538490 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.549107 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.564343 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568299 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.584009 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587990 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.604341 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt at 19:56:08.584009, elided ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.607974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608043 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.619432 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.619541 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723660 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826447 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929359 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929477 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032329 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.134896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135081 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145431 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145630 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145653 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145961 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238559 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341323 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444257 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650195 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752432 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.789239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.789358 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.789424 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:13.789406204 +0000 UTC m=+116.326374138 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854280 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.145439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:10 crc kubenswrapper[4722]: E0226 19:56:10.145659 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471480 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678167 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780556 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883209 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985902 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088650 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145765 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145821 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145784 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.145951 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.146053 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.146171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.191961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.191993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192026 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295068 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398319 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.453632 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.468348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc 
kubenswrapper[4722]: I0226 19:56:11.482570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.498837 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.500922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.513478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.526516 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.538230 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.558632 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.570539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.583358 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.595576 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604806 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.608482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.621608 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.632985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.647692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.666758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.708940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709074 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812222 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914862 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122964 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.145654 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:12 crc kubenswrapper[4722]: E0226 19:56:12.145997 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.225993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226124 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329916 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.432930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.432992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433059 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535421 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638460 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843511 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049996 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145883 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145951 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146591 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146829 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153475 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.256823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.361014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463198 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566397 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669751 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.834003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.834267 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.834347 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.834322597 +0000 UTC m=+124.371290531 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875927 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979421 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081814 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.145518 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:14 crc kubenswrapper[4722]: E0226 19:56:14.145795 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184365 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287933 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390612 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493736 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596426 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905252 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008310 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111231 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.145941 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146227 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.146411 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146610 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.146739 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146806 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.214983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318586 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421455 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524281 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626945 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730120 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941605 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.145030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:16 crc kubenswrapper[4722]: E0226 19:56:16.145173 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.145990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146049 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454721 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556863 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762361 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865975 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071390 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145512 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145621 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145724 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145828 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.146997 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.174052 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277355 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379507 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.481939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482057 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584740 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.641696 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.643940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.644349 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.655820 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.668598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.679799 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 
19:56:17.686859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.689732 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410
cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.703713 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.720242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.730109 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.744026 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e3
83cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64b
b216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.763444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.774552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.785316 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.792858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 
19:56:17.792901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793867 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.800012 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.812443 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.823937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.834193 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc 
kubenswrapper[4722]: I0226 19:56:17.896227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896316 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.099684 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.144880 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.145008 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.156115 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc 
kubenswrapper[4722]: I0226 19:56:18.170462 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.181554 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.191615 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.203716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.215803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.231298 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.237027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.247341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.257598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.268204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.277757 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.289009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.298782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.315844 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.332376 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.650040 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.650842 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654301 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" exitCode=1 Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd"} Feb 26 
19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654409 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.656228 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.656661 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.669924 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.693431 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.710263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.724401 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.740743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.756394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.780206 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] Service openshift-dns/dns-default 
for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 
19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c
2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.792417 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.808647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.827756 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.844654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.859006 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.869921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 
19:56:18.869970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.869982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.870005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.870024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.873201 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.888194 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.892516 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac
889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893165 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.908220 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.917294 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.929203 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.952445 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956628 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.969833 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.969982 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145228 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145399 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145529 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.661820 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.665838 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.666024 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.686827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.701791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.720584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.738353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.751345 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.765635 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.788217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.801939 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.815323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.829626 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.842422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.860565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.876782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.890645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.904412 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:20 crc kubenswrapper[4722]: I0226 19:56:20.146030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:20 crc kubenswrapper[4722]: E0226 19:56:20.146273 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.029794 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.029990 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.029954167 +0000 UTC m=+155.566922141 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130698 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130721 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130733 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130773 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130701 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130782 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.130765693 +0000 UTC m=+155.667733617 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130880 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130932 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130960 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130903 4722 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.130873456 +0000 UTC m=+155.667841420 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131062 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.13103652 +0000 UTC m=+155.668004474 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131191 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131250 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 19:56:53.131236145 +0000 UTC m=+155.668204109 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145383 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145430 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145494 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145577 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145649 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.837828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.838014 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.838103 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:37.838078065 +0000 UTC m=+140.375046029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:22 crc kubenswrapper[4722]: I0226 19:56:22.145512 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:22 crc kubenswrapper[4722]: E0226 19:56:22.145980 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:22 crc kubenswrapper[4722]: I0226 19:56:22.163865 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.165981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.166061 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166106 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.165991 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166253 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166404 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.232997 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:24 crc kubenswrapper[4722]: I0226 19:56:24.145504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:24 crc kubenswrapper[4722]: E0226 19:56:24.146061 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145408 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145478 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.145935 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.146288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.146373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.157523 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 19:56:26 crc kubenswrapper[4722]: I0226 19:56:26.145294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:26 crc kubenswrapper[4722]: E0226 19:56:26.145464 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.145687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.145765 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.145868 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.145985 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.146092 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.146256 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.145583 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:28 crc kubenswrapper[4722]: E0226 19:56:28.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.165920 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T19:54:46Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.182814 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.201816 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.213593 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.224672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: E0226 19:56:28.233477 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.237483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.250265 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.262751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.272875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.282248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.294270 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.306926 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.319514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.328793 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.339201 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.349771 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.360207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145661 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145730 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145788 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.145801 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.146028 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.146279 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.302362 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305628 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.316407 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.319991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320025 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320048 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.333492 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.349318 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352491 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.363960 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.364072 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:30 crc kubenswrapper[4722]: I0226 19:56:30.145777 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:30 crc kubenswrapper[4722]: E0226 19:56:30.146001 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146041 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146047 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146215 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.146877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147001 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147217 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.147338 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147586 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:32 crc kubenswrapper[4722]: I0226 19:56:32.145038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:32 crc kubenswrapper[4722]: E0226 19:56:32.145515 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145080 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145386 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145223 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.234353 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:34 crc kubenswrapper[4722]: I0226 19:56:34.144971 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:34 crc kubenswrapper[4722]: E0226 19:56:34.145118 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145636 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146571 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146942 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:36 crc kubenswrapper[4722]: I0226 19:56:36.146239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:36 crc kubenswrapper[4722]: E0226 19:56:36.146399 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145345 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145366 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.145589 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.145671 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145860 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.146073 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.907625 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.907805 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.907913 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:09.907886193 +0000 UTC m=+172.444854147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.145858 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:38 crc kubenswrapper[4722]: E0226 19:56:38.146063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.166052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.185427 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc 
kubenswrapper[4722]: I0226 19:56:38.202793 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.218486 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: E0226 19:56:38.235124 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.237132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.257975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.280346 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.297281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.314999 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.340357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54
:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.358587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.372361 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.389055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.406594 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.424504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.440797 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.456622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.145799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145837 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.145980 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.146065 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.698977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699071 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.713783 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717824 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.741752 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.747989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771924 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793423 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.806845 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.806952 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.145502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:40 crc kubenswrapper[4722]: E0226 19:56:40.145718 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735108 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735233 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb99326-dd22-4186-84da-ba208f104cd6" containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" exitCode=1 Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerDied","Data":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735886 4722 scope.go:117] "RemoveContainer" containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.755050 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.783573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.793984 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.809333 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.822549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.835845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.850827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.871991 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.887950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc 
kubenswrapper[4722]: I0226 19:56:40.903320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.914648 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.924237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.931777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.940798 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.949676 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54
:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.960696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.976301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145211 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145478 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.145794 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.145721 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.146021 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.741128 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.742222 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097"} Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.764378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.785327 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.804491 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.819009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.838430 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.861566 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.879591 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.895921 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc 
kubenswrapper[4722]: I0226 19:56:41.917375 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.936452 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.953587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.972072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.992606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.016361 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.051485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.067930 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.084990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.147117 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.147456 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:42 crc kubenswrapper[4722]: E0226 19:56:42.147478 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.748008 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.751307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.751652 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.767781 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.781662 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.799220 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 
19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.810467 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.821276 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.832643 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.843556 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.852720 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.862560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.874946 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.885655 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.897702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.912908 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.924309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.938975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 
19:56:42.953598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.966937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145825 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146024 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145835 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146266 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.236519 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.759344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.760949 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.765700 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" exitCode=1 Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.765760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 
19:56:43.765846 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.766876 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.767183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.788055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae5533
07b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.810578 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is 
after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.831936 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.862180 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.874694 
4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.891275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contain
erID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.909334 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.925801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.941483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.961465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.976174 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.994928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.014633 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.030634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.048406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.065238 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.083306 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.145695 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:44 crc kubenswrapper[4722]: E0226 19:56:44.146036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.164706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.771970 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.776828 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:44 crc kubenswrapper[4722]: E0226 19:56:44.777103 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.790209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.814192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://942
8ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.833506 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.854166 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.870047 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.889497 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.905276 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.922787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.934791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc 
kubenswrapper[4722]: I0226 19:56:44.950857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.970951 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.986230 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.999789 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.015114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.035535 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.051444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.067091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.088581 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145720 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145762 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.145844 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.146070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.146120 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:46 crc kubenswrapper[4722]: I0226 19:56:46.145274 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:46 crc kubenswrapper[4722]: E0226 19:56:46.145486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.145643 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.145675 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.146003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.146070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.146246 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.145403 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:48 crc kubenswrapper[4722]: E0226 19:56:48.145689 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.169617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 
19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.183852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc 
kubenswrapper[4722]: I0226 19:56:48.202248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.214592 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.225885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: E0226 19:56:48.237015 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.237533 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.252898 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.263561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.282102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.292557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.302498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.314118 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a
627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.331833 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.343031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.354290 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.364086 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.382845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.400017 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.145753 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.145858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.146032 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.146078 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.146190 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.146233 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.019740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020699 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.044463 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049905 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.071868 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.101633 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106227 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.125028 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129585 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.145464 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.145627 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.150070 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.150522 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145512 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145782 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145952 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:52 crc kubenswrapper[4722]: I0226 19:56:52.145398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:52 crc kubenswrapper[4722]: E0226 19:56:52.145592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.103226 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.103502 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.103454218 +0000 UTC m=+219.640422182 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144970 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145236 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145397 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.161200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204437 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204436 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204493 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204518 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204539 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204609 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204577494 +0000 UTC m=+219.741545458 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204615 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204670 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204631085 +0000 UTC m=+219.741599059 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204451 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204709 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204690257 +0000 UTC m=+219.741658261 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204725 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204805 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204779869 +0000 UTC m=+219.741747833 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.238743 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:54 crc kubenswrapper[4722]: I0226 19:56:54.145208 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:54 crc kubenswrapper[4722]: E0226 19:56:54.145413 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145293 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145324 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.145906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.145714 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.146022 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:56 crc kubenswrapper[4722]: I0226 19:56:56.145899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:56 crc kubenswrapper[4722]: E0226 19:56:56.146060 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145299 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145517 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145582 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145774 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145909 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.145678 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:58 crc kubenswrapper[4722]: E0226 19:56:58.145955 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.170100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.186514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc 
kubenswrapper[4722]: I0226 19:56:58.202695 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.217916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.228646 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: E0226 19:56:58.239777 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.250838 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4
146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.268975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.288055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.299426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.311168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.322494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.379673 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7af42c5-ca4e-4187-8378-daba58768af4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c4b07a88f55918dbcd7136aaf157af63386ad3c03605a48bf45c27d8defb79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d23a64bcdecb7b3c3af4e5b3b6ebbeeabde099fcbc9ffe6c844913e53b3889\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9073a1c88735e9e00c2332d6615d61dfa4794cb89be27db10df29ccf0614dc41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ba4ffc96221354be83ab1d9dc2e9f7d362d6cdc22315d0f8d880f063131d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03310a3fe7e38b4a89ded37ad392faa9e07f5cf7a261d5cb34625013d4856608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.392755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.407714 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.420063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.435743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.450559 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.461983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.474931 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.144993 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.145127 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145205 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.145268 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145499 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.146897 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.147242 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.145507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.145637 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.294835 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298302 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.308955 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312618 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.327111 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.344629 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.366840 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.366943 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145495 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145680 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145833 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:02 crc kubenswrapper[4722]: I0226 19:57:02.145361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:02 crc kubenswrapper[4722]: E0226 19:57:02.145580 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145307 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145504 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145335 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145567 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.241259 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:04 crc kubenswrapper[4722]: I0226 19:57:04.146091 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:04 crc kubenswrapper[4722]: E0226 19:57:04.146346 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.144976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.144997 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145247 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.145020 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145367 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145597 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:06 crc kubenswrapper[4722]: I0226 19:57:06.145398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:06 crc kubenswrapper[4722]: E0226 19:57:06.145559 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145483 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.145603 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145742 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.145870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.146020 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.145394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:08 crc kubenswrapper[4722]: E0226 19:57:08.145610 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.159571 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.185389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.18537528 podStartE2EDuration="24.18537528s" podCreationTimestamp="2026-02-26 19:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.185106832 +0000 UTC m=+170.722074786" watchObservedRunningTime="2026-02-26 19:57:08.18537528 +0000 UTC m=+170.722343204" Feb 26 19:57:08 crc kubenswrapper[4722]: E0226 19:57:08.241915 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.259657 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=15.259635471 podStartE2EDuration="15.259635471s" podCreationTimestamp="2026-02-26 19:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.234075024 +0000 UTC m=+170.771042978" watchObservedRunningTime="2026-02-26 19:57:08.259635471 +0000 UTC m=+170.796603395" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.276997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=46.276968084 podStartE2EDuration="46.276968084s" podCreationTimestamp="2026-02-26 19:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.259910889 +0000 UTC m=+170.796878853" watchObservedRunningTime="2026-02-26 19:57:08.276968084 +0000 UTC m=+170.813936048" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.319050 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pkptb" podStartSLOduration=122.3190263 podStartE2EDuration="2m2.3190263s" podCreationTimestamp="2026-02-26 19:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.318731292 +0000 UTC m=+170.855699236" watchObservedRunningTime="2026-02-26 19:57:08.3190263 +0000 UTC m=+170.855994264" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.344260 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p2glm" podStartSLOduration=121.344243767 podStartE2EDuration="2m1.344243767s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.343109935 +0000 UTC m=+170.880077859" watchObservedRunningTime="2026-02-26 19:57:08.344243767 +0000 UTC m=+170.881211691" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.406801 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.406757366 podStartE2EDuration="43.406757366s" podCreationTimestamp="2026-02-26 19:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.393748055 +0000 UTC m=+170.930715979" watchObservedRunningTime="2026-02-26 19:57:08.406757366 +0000 UTC m=+170.943725290" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.444214 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-glv66" podStartSLOduration=122.44418387 podStartE2EDuration="2m2.44418387s" podCreationTimestamp="2026-02-26 19:55:06 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.443606623 +0000 UTC m=+170.980574567" watchObservedRunningTime="2026-02-26 19:57:08.44418387 +0000 UTC m=+170.981151844" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.463163 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cfwh9" podStartSLOduration=121.463127068 podStartE2EDuration="2m1.463127068s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.461992886 +0000 UTC m=+170.998960820" watchObservedRunningTime="2026-02-26 19:57:08.463127068 +0000 UTC m=+171.000095012" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.509249 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.509233619 podStartE2EDuration="1m7.509233619s" podCreationTimestamp="2026-02-26 19:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.508970382 +0000 UTC m=+171.045938306" watchObservedRunningTime="2026-02-26 19:57:08.509233619 +0000 UTC m=+171.046201543" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145508 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145537 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145653 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145510 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.976288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.976494 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.976724 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:58:13.976705501 +0000 UTC m=+236.513673445 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.146014 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:10 crc kubenswrapper[4722]: E0226 19:57:10.146311 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:10Z","lastTransitionTime":"2026-02-26T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.692082 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podStartSLOduration=123.692057414 podStartE2EDuration="2m3.692057414s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.562951758 +0000 UTC m=+171.099919682" watchObservedRunningTime="2026-02-26 19:57:10.692057414 +0000 UTC m=+173.229025378" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.693320 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s"] Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.693837 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.696573 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.696933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.697381 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.697460 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.713556 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" 
podStartSLOduration=123.713536734 podStartE2EDuration="2m3.713536734s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:10.712975099 +0000 UTC m=+173.249943043" watchObservedRunningTime="2026-02-26 19:57:10.713536734 +0000 UTC m=+173.250504668" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.886213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.886281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc 
kubenswrapper[4722]: I0226 19:57:10.887180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.894930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.929371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.021690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:11 crc kubenswrapper[4722]: W0226 19:57:11.045089 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a34f49_185a_413f_80e6_25bb23108c78.slice/crio-c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26 WatchSource:0}: Error finding container c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26: Status 404 returned error can't find the container with id c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26 Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.145687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.146189 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.146282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.145682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.147166 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.147286 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.194280 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.200902 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.868603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" event={"ID":"36a34f49-185a-413f-80e6-25bb23108c78","Type":"ContainerStarted","Data":"dcf517fbd501c23d7fe99e3f21611ac6463f4beb3263fe3818773fbdee89ea20"} Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.868655 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" event={"ID":"36a34f49-185a-413f-80e6-25bb23108c78","Type":"ContainerStarted","Data":"c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26"} Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.886063 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" podStartSLOduration=124.886039098 podStartE2EDuration="2m4.886039098s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:11.885290606 +0000 UTC m=+174.422258610" watchObservedRunningTime="2026-02-26 19:57:11.886039098 +0000 UTC m=+174.423007092" Feb 26 19:57:12 crc kubenswrapper[4722]: I0226 19:57:12.145642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:12 crc kubenswrapper[4722]: E0226 19:57:12.145788 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145231 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145344 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.145468 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.145820 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.146555 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.146958 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.147238 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.243720 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:14 crc kubenswrapper[4722]: I0226 19:57:14.145340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:14 crc kubenswrapper[4722]: E0226 19:57:14.146062 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145238 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145310 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145807 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145941 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:16 crc kubenswrapper[4722]: I0226 19:57:16.145605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:16 crc kubenswrapper[4722]: E0226 19:57:16.145827 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145825 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.145898 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.146046 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.146281 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:18 crc kubenswrapper[4722]: I0226 19:57:18.146512 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:18 crc kubenswrapper[4722]: E0226 19:57:18.146623 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:18 crc kubenswrapper[4722]: E0226 19:57:18.244457 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145508 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.145915 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145585 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145513 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.146208 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.146405 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:20 crc kubenswrapper[4722]: I0226 19:57:20.145258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:20 crc kubenswrapper[4722]: E0226 19:57:20.145450 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145741 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146365 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145892 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145889 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:22 crc kubenswrapper[4722]: I0226 19:57:22.146083 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:22 crc kubenswrapper[4722]: E0226 19:57:22.146314 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.145844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.146017 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.146049 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.146633 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.146846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.147113 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.246051 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.145962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:24 crc kubenswrapper[4722]: E0226 19:57:24.146132 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.146967 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.914387 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.916845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.918293 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.942530 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podStartSLOduration=137.942500812 podStartE2EDuration="2m17.942500812s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:24.940802724 +0000 UTC m=+187.477770658" watchObservedRunningTime="2026-02-26 19:57:24.942500812 +0000 UTC m=+187.479468776" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.053040 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.053157 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.053255 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145124 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145210 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145275 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145455 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145797 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.144965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145043 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145034 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145451 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145194 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145587 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145720 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145462 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145511 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145514 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145462 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149630 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149584 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.151247 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.151652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.153951 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.056061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.093513 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.098266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 
19:57:31.098580 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.098993 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.099254 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.099507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.100650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.101430 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.101970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.102424 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.102764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.104454 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.112201 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113350 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113713 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114667 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114799 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114906 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115017 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115141 
4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115250 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115435 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115457 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115535 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115602 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115834 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115967 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116097 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116221 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116346 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116555 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116596 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116560 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116681 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116685 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117009 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117132 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117363 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117426 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118751 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.119100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.122719 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.123778 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.134409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.137811 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.139458 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.143276 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.143521 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.144568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.144741 4722 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.145096 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.148614 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.150545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.150762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151992 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152275 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152981 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.157970 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.160969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.162433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.162970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.163417 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.166988 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167662 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.168534 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.168779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.172654 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.174466 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.174712 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175025 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175267 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175363 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175427 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175492 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.175573 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175647 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175678 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175862 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175972 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176344 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176989 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177205 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177224 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180411 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180609 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180977 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.181949 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" 
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182745 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182760 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182863 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182975 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183012 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183051 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183081 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183125 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183207 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183247 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183210 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183556 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183705 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183847 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.184028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.185684 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186122 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186277 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186487 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186948 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.187086 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.187670 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.190347 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.190798 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.194977 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.201325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.202631 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.203571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.204347 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.204718 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.205460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: 
\"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qthz\" (UniqueName: 
\"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207507 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207561 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.207618 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb7g\" (UniqueName: \"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207915 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: 
\"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208209 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208326 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208379 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 
19:57:31.208464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: 
\"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208618 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: 
I0226 19:57:31.217767 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219274 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219321 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219865 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.220311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.223026 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.227785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.231200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.232072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.232286 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.235775 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.236286 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.238166 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.238690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.240461 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.241230 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.246551 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.248638 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.249082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.249297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.250224 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.252116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.252606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.264402 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kwwbn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.264861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265172 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265647 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265890 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.266233 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.266527 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.270006 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.270886 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271318 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271542 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.273666 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.273702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.274802 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.275359 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.275413 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.277293 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.278321 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.280087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.280553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.283032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.284311 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.285340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.299236 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.304233 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.307138 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4wdxv"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.308131 4722 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309332 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309359 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309417 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309433 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 
19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309514 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.309671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: 
\"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qthz\" (UniqueName: \"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309936 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309967 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310000 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310029 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: 
\"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310148 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb7g\" (UniqueName: 
\"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310194 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310226 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: 
I0226 19:57:31.310255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310496 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.312183 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.315891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.316228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.316421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317250 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317372 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.322531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324050 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.327750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: 
I0226 19:57:31.328385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.328952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330138 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.331002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331021 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: 
\"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.333007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335272 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335793 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335963 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.336827 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.337021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.338498 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.341817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.341865 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343245 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343392 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.347267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.347767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.348367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.348828 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.349318 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351511 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351547 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.352177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.355147 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.356322 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.356566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.359748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.361029 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.361187 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.362483 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.363557 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.363585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367192 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367233 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367424 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4wdxv"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370777 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370788 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mrk8s"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.371345 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.372509 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.392944 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.403759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.410904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 
19:57:31.411106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412045 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" 
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412158 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " 
pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413030 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412965 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: 
\"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.417396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.419338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.420322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.420835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " 
pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.424359 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.443453 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.463856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.484235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.504110 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.523122 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.543539 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.563591 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.583223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.603653 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.624546 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.663617 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.684208 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.703675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.723399 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.744177 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.763954 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.783937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.803492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.815008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.824385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.835795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.844083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.865555 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.873636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.890068 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.894705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.904468 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.926389 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.931388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.944076 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.971473 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.975100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.984813 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.003520 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.024097 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.044297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.065016 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.085114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.105858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.124321 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.144545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.165002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.184694 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.204558 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.224232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.244816 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.261901 4722 request.go:700] Waited for 1.011452374s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dimage-registry-operator-tls&limit=500&resourceVersion=0 Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.265008 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.284734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.304186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.324399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.344898 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.363877 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.383818 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.424161 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.445323 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.464180 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.484449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.504402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.524119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.544908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.563386 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 
19:57:32.583946 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.604221 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.623085 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.643594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.664567 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.682912 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.704433 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.725286 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.744856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.763296 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.784743 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.804527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.823926 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.843894 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.863691 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.883944 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.904190 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.923818 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.943578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.963207 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.984634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 19:57:33 crc kubenswrapper[4722]: 
I0226 19:57:33.017510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.037413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.057704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.077170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.079740 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.097912 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.100472 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffb7g\" (UniqueName: \"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.103870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.122098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.141847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.158618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:33 
crc kubenswrapper[4722]: I0226 19:57:33.170709 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.177504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.198924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qthz\" (UniqueName: \"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.221220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.239302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.244392 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.262377 4722 request.go:700] Waited for 1.909939363s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.264595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.287851 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.287860 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.302906 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.304675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.318973 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.323395 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.323437 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.338337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.343768 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.345114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.353969 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.363910 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.371661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.383681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.387800 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.407568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.424286 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.468996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.478885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.507108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.511417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.531780 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.540924 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.540981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541259 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541532 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.541925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541983 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542374 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: 
I0226 19:57:33.542836 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod 
\"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543155 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" 
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.543704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.553126 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"] Feb 26 19:57:33 crc 
kubenswrapper[4722]: E0226 19:57:33.553794 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.053780235 +0000 UTC m=+196.590748159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.560840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.604733 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.605704 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod597fba49_4fb4_4060_af46_9b6fc47c89fc.slice/crio-c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704 WatchSource:0}: Error finding container c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704: Status 404 returned error can't find the container with id c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704 Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.620003 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.645708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.646059 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.146018978 +0000 UTC m=+196.682986902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.646948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.648198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.649303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" 
(UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650511 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 
19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\" 
(UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651399 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651416 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651502 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651558 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651719 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651819 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651930 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.652087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: 
\"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652413 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.652475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") 
pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652746 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652799 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653025 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.655723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.656427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" 
Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.656539 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.156520837 +0000 UTC m=+196.693488761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.656730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.657046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.658068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.658889 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.660003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.661073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.661114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.663954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.664717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.665142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.666013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.666527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.667010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.667266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.668801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.669388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.669510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.670031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.670397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.671161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.672480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.672931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.673496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.673960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.675042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.675545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.675896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.677051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.677999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.679875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.680475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.680900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681878 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.684364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.688245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.694787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.703237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.725369 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.738755 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.758690 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:34.258672973 +0000 UTC m=+196.795640897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759046 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759536 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.759657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.760665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.761195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.761255 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.762172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.763386 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.763412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.763990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.765664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 
crc kubenswrapper[4722]: I0226 19:57:33.766836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.767011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.767038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.768265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.772503 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.772854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") 
pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.773480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.773855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774305 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: 
\"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.775735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.776335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.780191 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.781918 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.789018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod 
\"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.790256 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.792391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.797439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.822991 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.825141 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.840706 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.847703 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.863750 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd819da_de96_4dc4_a893_2ae7b1be33b2.slice/crio-3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38 WatchSource:0}: Error finding container 3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38: Status 404 returned error can't find the container with id 3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38 Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.868823 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.869529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.870085 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.37007281 +0000 UTC m=+196.907040734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.877851 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.878668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.891830 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.893240 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.894132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.895083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.903930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.925296 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee5cc87_0769_444c_befc_7c1df0fb1fa3.slice/crio-ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff WatchSource:0}: Error finding container ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff: Status 404 returned error can't find the container with id ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.926015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.931369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.940795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.940954 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46842c31_3b12_4cbf_b722_327327cf8375.slice/crio-d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b WatchSource:0}: Error finding container d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b: Status 404 returned error can't find the container with id d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.941339 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.945109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.945523 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.953006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.957590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerStarted","Data":"7db41b957050fc129ed0f7ac58136d594e78c3d1c88570df2c41137b2df788fe"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.964755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerStarted","Data":"0366e10c65e8cc941116fc7b7e1c31e8272fad9a9e2fa1bb65591f9e3ee11b0f"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.967238 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.970223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.970984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.971293 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.471246067 +0000 UTC m=+197.008213991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.971628 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.971954 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.471937047 +0000 UTC m=+197.008904971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.979383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" event={"ID":"373e6a27-b86f-4e9d-a9eb-5b2837808dcd","Type":"ContainerStarted","Data":"592ecd0bfc60b510283896b3a1d7eda8f173484309e99f2dae33a1a6d16d812b"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.981125 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.982572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerStarted","Data":"80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.982594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerStarted","Data":"c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.983226 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.984159 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.991337 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ddcll container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.991387 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" event={"ID":"af1acacb-c369-4dae-8f27-1cdd6c94f8e7","Type":"ContainerStarted","Data":"65cfe80acf9818519fb081b065b6320f178e03d570eb47cf2504e2da75d46de5"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008903 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" event={"ID":"af1acacb-c369-4dae-8f27-1cdd6c94f8e7","Type":"ContainerStarted","Data":"c8c1b21b58a89a772769978132c24e859470a2c99e3287321ea454e3f64b1fff"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.013393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"ca6ceb9e0d2c89d9fc59fb1b468e8569bac42bd509d988ee308bba72a30005f2"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.014775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" event={"ID":"d5a9e6a6-79fe-454f-aec5-668c51bcc879","Type":"ContainerStarted","Data":"23079f46f6cc4d23cf7413a01dd67a0aca0aa22afcacee4853daa244252d916d"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.016367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.017489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerStarted","Data":"d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.020595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 
19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.021314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"2d25faf9524b04e01e231d61823b633f808244cadc520fd1a559ed8811ae9070"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.022585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" event={"ID":"0ee5cc87-0769-444c-befc-7c1df0fb1fa3","Type":"ContainerStarted","Data":"ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.025435 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sbl7q" event={"ID":"ab76d410-2de1-47c9-a03c-be7a2b1fabab","Type":"ContainerStarted","Data":"7e3054b60e6ca6b0ad762c252f14f20863a32e600da9842f5b4c75dbbca18d5f"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.025468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sbl7q" event={"ID":"ab76d410-2de1-47c9-a03c-be7a2b1fabab","Type":"ContainerStarted","Data":"193da6c36c25e8b1819e9a17761a66c415962386bacdc53402b01562f8d53fb3"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.026405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:34 crc kubenswrapper[4722]: W0226 19:57:34.034676 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b11897_db24_4d65_a438_d3695ccee5fc.slice/crio-9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a WatchSource:0}: Error finding container 9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a: Status 404 returned error can't find the container with id 
9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.034752 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.034798 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.044693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.060599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"a8ecebedeee56dd69286d389599cda0ab5e5e18e1c5e0674c23d995c11674324"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.060637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"30a6b4301c9e74be6e2ef91a93989ea4e1b4dc879831a5e0a3177922807e4c82"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.062391 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.079797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.081181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.581125762 +0000 UTC m=+197.118093696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.082814 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.089043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.093189 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.108095 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.114401 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.118987 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.119393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 
19:57:34.155766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.156939 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.166277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.168824 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.177645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.182401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.183103 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.683087841 +0000 UTC m=+197.220055755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.187760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.214241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.214458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.218983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.228774 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.232533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.257694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.260065 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.261315 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.274269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.286446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.286919 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.786899934 +0000 UTC m=+197.323867858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.293593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.302662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.305911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.306204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.316241 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.322853 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.333239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.340281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.363736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.364634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.388584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.389462 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.88944736 +0000 UTC m=+197.426415284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.490899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.491629 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.991319727 +0000 UTC m=+197.528287651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.531842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.580191 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.593792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.594760 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.094745548 +0000 UTC m=+197.631713472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.598967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.599180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.637528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"] Feb 26 19:57:34 crc kubenswrapper[4722]: W0226 19:57:34.640019 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15c05814_e318_455c_83f7_40698b29a44d.slice/crio-67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db WatchSource:0}: Error finding container 67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db: Status 404 returned error can't find the container with id 67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.642922 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.650919 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.694768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.695529 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:35.195512644 +0000 UTC m=+197.732480568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.698711 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.717007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.798196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.801443 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.301426286 +0000 UTC m=+197.838394210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.898993 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.899501 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.399483734 +0000 UTC m=+197.936451658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.948568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.951081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.000455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.000981 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.500965 +0000 UTC m=+198.037932934 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.010184 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.095569 4722 generic.go:334] "Generic (PLEG): container finished" podID="1987ed24-91bb-4ba3-afb2-807c5a25de00" containerID="915dc00c90c275fcfe37f778fb93bb0c4206cdf8af4d15800a41409c0325d869" exitCode=0 Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.095797 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerDied","Data":"915dc00c90c275fcfe37f778fb93bb0c4206cdf8af4d15800a41409c0325d869"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.097998 4722 generic.go:334] "Generic (PLEG): container finished" podID="1382161f-eb97-4181-b983-7a6ca893b4e4" containerID="9ac1541cf98bc1d7c0dcb6b9173750d92ce4438589d45b33d590b67794f41679" exitCode=0 Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.098086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerDied","Data":"9ac1541cf98bc1d7c0dcb6b9173750d92ce4438589d45b33d590b67794f41679"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.101282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.101569 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.601552891 +0000 UTC m=+198.138520805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.104586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" event={"ID":"373e6a27-b86f-4e9d-a9eb-5b2837808dcd","Type":"ContainerStarted","Data":"03a1be7562905188739316c7b573a9b996cff492350ceb7ed185c79c22534ae6"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.113945 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"cf9fdc3ebbe84102d6788cb7f25d504589d7d42b7743c85bd77255a792add505"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.149421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" event={"ID":"197397a2-75ee-4ddd-937d-3ee4d299252a","Type":"ContainerStarted","Data":"9082b7e7d61d8aba3bf9d8b7c4292389ae4e9a981517870539c2fb1fe2b9e8ba"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.156770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerStarted","Data":"276f96c20b112e49a7e22df1751b734dd6c8d0b22d1debea8e9f0abd1d77f1fb"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.177463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerStarted","Data":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.177507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerStarted","Data":"7edb51afac751cd6bd9eeebb7fe8eca97e5c451376b3ba5cf7db2672829e5803"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.178027 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.187571 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lrsc8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.187625 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.191154 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" event={"ID":"325ff868-2054-49be-be1c-971fc9411922","Type":"ContainerStarted","Data":"7a963ffc91c972e6e356dc20824a37ac0478126a605b58e12acee239f193dcd6"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.191206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" event={"ID":"325ff868-2054-49be-be1c-971fc9411922","Type":"ContainerStarted","Data":"237daa1b378933f3b116a4b6c857bc9cef02eb14cba750695780981676b784a5"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.194864 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.201642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerStarted","Data":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.201685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerStarted","Data":"9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.204304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.204873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.206069 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.706057832 +0000 UTC m=+198.243025756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.206361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kwwbn" event={"ID":"2c3aef3b-8f94-47f3-8c12-e281c775f919","Type":"ContainerStarted","Data":"170caafa70195fa385c801fa6b78531225574b5336e24074bdf9e9b211405a3d"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.217481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mrk8s" event={"ID":"ecebccf3-47a9-4cba-a0ab-873ad1f18284","Type":"ContainerStarted","Data":"a0022fc9d143d633ab84074eaf2d0f14697a648df56de6c19c8eb0624ba080e3"} 
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.222036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerStarted","Data":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224808 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vpr4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224861 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224982 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dhg7f container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.225037 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" podUID="325ff868-2054-49be-be1c-971fc9411922" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.231634 4722 generic.go:334] "Generic (PLEG): container finished" podID="b3b40efb-02fd-4bd1-9839-01755419392a" 
containerID="8049ce86f240319aff5db63c10781fe4ad304468ea8c8d93c8a1c798da4ad52c" exitCode=0 Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.231715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerDied","Data":"8049ce86f240319aff5db63c10781fe4ad304468ea8c8d93c8a1c798da4ad52c"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.246691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"227b8378609a3f67991d76779c42e31de3304ff3ab40e8bb3d0e4b707b88af90"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.262478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"d04ddc93e657508bc37ba24b56f0d95a3cf736b46f4fa13987017e49df30c04e"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.267569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"1fd212f95b2914119abbebeb7307e29fe8af4d2b3531c6dc864744d236152968"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.279476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" event={"ID":"54fc586b-a366-44ff-a10e-c561a9ebdd00","Type":"ContainerStarted","Data":"ba24dc72eaee3b5e6da806de5c6a867203ebe45d8c18bba69b4281c881135967"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.283746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-sbl9f" event={"ID":"f46c75d4-2d67-4537-a0ab-7622f406d085","Type":"ContainerStarted","Data":"3f4a963c33bd2690906c9baf94047970eeb1d9946b83864516c3bf631ca8f12c"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.283778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" event={"ID":"f46c75d4-2d67-4537-a0ab-7622f406d085","Type":"ContainerStarted","Data":"5947aec16b6a776305ed31518762a7cb34536161a30969115c4a68e614c81b7d"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.284461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.288729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" event={"ID":"d5a9e6a6-79fe-454f-aec5-668c51bcc879","Type":"ContainerStarted","Data":"cc65ea021c81e0449d57c58424aa8abc7106d8ce9014d8adcb84f048e2c63e83"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.297801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" event={"ID":"15c05814-e318-455c-83f7-40698b29a44d","Type":"ContainerStarted","Data":"67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.305811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.306183 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.80617076 +0000 UTC m=+198.343138684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.310235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" event={"ID":"0ee5cc87-0769-444c-befc-7c1df0fb1fa3","Type":"ContainerStarted","Data":"3a956d5a0a90b560094b67ae258b004c675844751635e5b378a8ea90399e65a2"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.354349 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-sbl9f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.354396 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" podUID="f46c75d4-2d67-4537-a0ab-7622f406d085" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.369616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.378915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.383443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"261ee923f396b132fdf24b2da6ef4c074a41ce5c15719b0107b53f7c98bf1a48"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.386052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"17444bed5988afe706742b7073b782a0ba17db9405eb89b30005f562f2c2b61c"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.407399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.408955 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.908943432 +0000 UTC m=+198.445911356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerStarted","Data":"2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87"} Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415504 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415528 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.429401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.441039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.466697 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.479637 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.493493 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" podStartSLOduration=148.493475616 podStartE2EDuration="2m28.493475616s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.479246842 +0000 UTC m=+198.016214786" watchObservedRunningTime="2026-02-26 19:57:35.493475616 +0000 UTC m=+198.030443540" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.495668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.500400 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.507927 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.509483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.509927 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.009912884 +0000 UTC m=+198.546880808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.515671 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"] Feb 26 19:57:35 crc kubenswrapper[4722]: W0226 19:57:35.541503 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42451eee_951a_41bf_8873_e4ae65fe087a.slice/crio-829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9 WatchSource:0}: Error finding container 829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9: Status 404 returned error can't find the container with id 829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9 Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.568707 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podStartSLOduration=148.568688455 podStartE2EDuration="2m28.568688455s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.551080964 +0000 UTC m=+198.088048888" 
watchObservedRunningTime="2026-02-26 19:57:35.568688455 +0000 UTC m=+198.105656369" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.611759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.612205 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.112189782 +0000 UTC m=+198.649157706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.637288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.657372 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4wdxv"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.700820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.714358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.714467 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.21445205 +0000 UTC m=+198.751419974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.714521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.714830 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.214815351 +0000 UTC m=+198.751783275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: W0226 19:57:35.728163 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730cba8e_b872_4ac3_a49c_57b789b21a3a.slice/crio-eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5 WatchSource:0}: Error finding container eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5: Status 404 returned error can't find the container with id eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5 Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.772999 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39140: no serving certificate available for the kubelet" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.796070 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sbl7q" podStartSLOduration=148.796048241 podStartE2EDuration="2m28.796048241s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.755217169 +0000 UTC m=+198.292185103" watchObservedRunningTime="2026-02-26 19:57:35.796048241 +0000 UTC m=+198.333016175" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.820812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.821873 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.321859264 +0000 UTC m=+198.858827188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.824407 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"] Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.875110 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39142: no serving certificate available for the kubelet" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.876327 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" podStartSLOduration=148.876307152 podStartE2EDuration="2m28.876307152s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.873651278 +0000 UTC m=+198.410619222" 
watchObservedRunningTime="2026-02-26 19:57:35.876307152 +0000 UTC m=+198.413275096" Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.923552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.923920 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.423909516 +0000 UTC m=+198.960877440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.974587 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39158: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.003191 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podStartSLOduration=149.00313417 podStartE2EDuration="2m29.00313417s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
19:57:36.002627785 +0000 UTC m=+198.539595729" watchObservedRunningTime="2026-02-26 19:57:36.00313417 +0000 UTC m=+198.540102094" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.025693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.026030 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.52601432 +0000 UTC m=+199.062982244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.051966 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podStartSLOduration=149.051949467 podStartE2EDuration="2m29.051949467s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.050529108 +0000 UTC m=+198.587497042" watchObservedRunningTime="2026-02-26 19:57:36.051949467 +0000 UTC m=+198.588917391" Feb 26 19:57:36 crc 
kubenswrapper[4722]: I0226 19:57:36.078515 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39170: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.124314 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" podStartSLOduration=149.124287565 podStartE2EDuration="2m29.124287565s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.108138825 +0000 UTC m=+198.645106759" watchObservedRunningTime="2026-02-26 19:57:36.124287565 +0000 UTC m=+198.661255499" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.131110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.133685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.633665901 +0000 UTC m=+199.170633825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.143045 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" podStartSLOduration=149.143032288 podStartE2EDuration="2m29.143032288s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.141376211 +0000 UTC m=+198.678344145" watchObservedRunningTime="2026-02-26 19:57:36.143032288 +0000 UTC m=+198.680000212" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.177586 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39174: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.200783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" podStartSLOduration=149.200761969 podStartE2EDuration="2m29.200761969s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.162501742 +0000 UTC m=+198.699469656" watchObservedRunningTime="2026-02-26 19:57:36.200761969 +0000 UTC m=+198.737729913" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.235277 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" podStartSLOduration=149.235259971 podStartE2EDuration="2m29.235259971s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.232194973 +0000 UTC m=+198.769162907" watchObservedRunningTime="2026-02-26 19:57:36.235259971 +0000 UTC m=+198.772227905" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.235351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.235521 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.735505538 +0000 UTC m=+199.272473462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.237611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.237985 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.737972308 +0000 UTC m=+199.274940232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.275937 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39178: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.291971 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" podStartSLOduration=149.291947893 podStartE2EDuration="2m29.291947893s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.282414031 +0000 UTC m=+198.819381965" watchObservedRunningTime="2026-02-26 19:57:36.291947893 +0000 UTC m=+198.828915817" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.325258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" podStartSLOduration=149.32524096 podStartE2EDuration="2m29.32524096s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.322773449 +0000 UTC m=+198.859741383" watchObservedRunningTime="2026-02-26 19:57:36.32524096 +0000 UTC m=+198.862208884" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.338339 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.338721 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.838704123 +0000 UTC m=+199.375672047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.377856 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-n77d2" podStartSLOduration=149.377833925 podStartE2EDuration="2m29.377833925s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.374570352 +0000 UTC m=+198.911538276" watchObservedRunningTime="2026-02-26 19:57:36.377833925 +0000 UTC m=+198.914801849" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.399609 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39184: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.402958 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.421897 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" podStartSLOduration=149.421880167 podStartE2EDuration="2m29.421880167s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.418338117 +0000 UTC m=+198.955306051" watchObservedRunningTime="2026-02-26 19:57:36.421880167 +0000 UTC m=+198.958848091" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.445272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.445595 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.945582942 +0000 UTC m=+199.482550856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.488328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" event={"ID":"108ac542-c708-437b-8538-9b20337835ce","Type":"ContainerStarted","Data":"69ed9ccc61f70ffb466ce53b4adcebdd82293d562eb20526f035f1c1b10fd6e2"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.508154 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bc7lz" event={"ID":"95c9eee6-d445-441c-bd33-67606423203e","Type":"ContainerStarted","Data":"4e295a469ebb358254722579cefee9c11b34215094775754a4506fa2dde5d10a"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.513714 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.526622 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39188: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.548775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.549224 4722 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.049208019 +0000 UTC m=+199.586175943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.549795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" event={"ID":"197397a2-75ee-4ddd-937d-3ee4d299252a","Type":"ContainerStarted","Data":"49ebc7606c80c5b8b4e01ac641f9d59b457994796ee26576c72def7407778a36"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.563671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerStarted","Data":"88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.563714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerStarted","Data":"90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.571683 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" podStartSLOduration=149.571470372 podStartE2EDuration="2m29.571470372s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.568696453 +0000 UTC m=+199.105664377" watchObservedRunningTime="2026-02-26 19:57:36.571470372 +0000 UTC m=+199.108438296" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.605032 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" podStartSLOduration=149.605016196 podStartE2EDuration="2m29.605016196s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.60339668 +0000 UTC m=+199.140364604" watchObservedRunningTime="2026-02-26 19:57:36.605016196 +0000 UTC m=+199.141984120" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.628354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" event={"ID":"15c05814-e318-455c-83f7-40698b29a44d","Type":"ContainerStarted","Data":"ed09f6a09ba67e82f0823d943d5c398671567eea2c2b4cccc439f9f96f0046e8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.647034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"a25fd9ed2957d3a99e09ad9ed613398d11f56d38dd1b83c3eefbd00a4606f2ab"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.647079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" 
event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"7272ae57de4209c32db808e84998a80713f4acff50b00df5ef04577c915533b8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.650804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.653807 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.153794433 +0000 UTC m=+199.690762357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.686340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" event={"ID":"d02d2f96-f341-476f-b9ce-c9cd482386f1","Type":"ContainerStarted","Data":"fcd78368df410589b070fb78a1745a45b1c8a3639ca2e51fd2a93505f515100a"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.705520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" 
event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"8beb85e054b102153d6a6d865c818ff5c38632a695d269e7856e47920f9c07c6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" podStartSLOduration=149.753176999 podStartE2EDuration="2m29.753176999s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.705697639 +0000 UTC m=+199.242665563" watchObservedRunningTime="2026-02-26 19:57:36.753176999 +0000 UTC m=+199.290144943" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.753696 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.253675444 +0000 UTC m=+199.790643408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" podStartSLOduration=149.753679914 podStartE2EDuration="2m29.753679914s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.753093566 +0000 UTC m=+199.290061500" watchObservedRunningTime="2026-02-26 19:57:36.753679914 +0000 UTC m=+199.290647838" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.765161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerStarted","Data":"9be156a6adba218ec30aec5cf7085227d8cb0329f7504390372a15708b2ee4d6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.765863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.788052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" event={"ID":"54fc586b-a366-44ff-a10e-c561a9ebdd00","Type":"ContainerStarted","Data":"84ac43d8c46681c06466460e047be2614118953681b84e9845fb3ced283b1244"} Feb 26 
19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.804465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mrk8s" event={"ID":"ecebccf3-47a9-4cba-a0ab-873ad1f18284","Type":"ContainerStarted","Data":"ff580bff9e1a989f525afe70dc26bccaea52324ab508fccaa3f1677e63208a95"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.817185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" event={"ID":"42451eee-951a-41bf-8873-e4ae65fe087a","Type":"ContainerStarted","Data":"b9284b3d69750bce55fb6c831c22d8eb2ccbe07275b1a779c0e5e281bbf81a1c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.817230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" event={"ID":"42451eee-951a-41bf-8873-e4ae65fe087a","Type":"ContainerStarted","Data":"829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.832603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"8e749465749b3f395ada0c336adfbb38a290ccd3e7799e1441e4423b2ad1a43c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.837434 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" podStartSLOduration=149.837414995 podStartE2EDuration="2m29.837414995s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.789174692 +0000 UTC m=+199.326142636" watchObservedRunningTime="2026-02-26 19:57:36.837414995 +0000 UTC m=+199.374382929" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.837859 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mrk8s" podStartSLOduration=5.837855337 podStartE2EDuration="5.837855337s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.836430186 +0000 UTC m=+199.373398120" watchObservedRunningTime="2026-02-26 19:57:36.837855337 +0000 UTC m=+199.374823261" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.855373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kwwbn" event={"ID":"2c3aef3b-8f94-47f3-8c12-e281c775f919","Type":"ContainerStarted","Data":"0121c293cd1adf199e3dec3372fa8265cf853a947e2aed055faba2b6a9d84be9"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.855584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.855850 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.355808528 +0000 UTC m=+199.892776452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.869394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" event={"ID":"17dc6750-14fe-4188-b5aa-527a0e1b6377","Type":"ContainerStarted","Data":"fb3ca7e70c4e85b1cffea6a0993ae3dbabdb68704ec547b2bb2fcb67804e1a8c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.869777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" event={"ID":"17dc6750-14fe-4188-b5aa-527a0e1b6377","Type":"ContainerStarted","Data":"6bddc1fef6cbc4a78bbf7e6c78700c7f3b969eabec6d92849c443691351a3a28"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.880594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" podStartSLOduration=149.880577842 podStartE2EDuration="2m29.880577842s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.868674044 +0000 UTC m=+199.405641968" watchObservedRunningTime="2026-02-26 19:57:36.880577842 +0000 UTC m=+199.417545766" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.903997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kwwbn" podStartSLOduration=149.903972987 
podStartE2EDuration="2m29.903972987s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.889257729 +0000 UTC m=+199.426225653" watchObservedRunningTime="2026-02-26 19:57:36.903972987 +0000 UTC m=+199.440940931" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.912262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" event={"ID":"879f1fab-2121-4c06-87dc-c83e272e91c7","Type":"ContainerStarted","Data":"bd35fd8197537ce202afd5a306c74d59bcd1efb5dd0b099a0efb26c385d6f685"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.912310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" event={"ID":"879f1fab-2121-4c06-87dc-c83e272e91c7","Type":"ContainerStarted","Data":"2062dae3e4eb7998400dd7cd1adf5f28307abf30f6e0a7ff48a50046c9e445e2"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.913084 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.925446 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k47rx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.925517 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" podUID="879f1fab-2121-4c06-87dc-c83e272e91c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection 
refused" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.926926 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" podStartSLOduration=149.926907489 podStartE2EDuration="2m29.926907489s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.926120307 +0000 UTC m=+199.463088221" watchObservedRunningTime="2026-02-26 19:57:36.926907489 +0000 UTC m=+199.463875413" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.927619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" event={"ID":"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f","Type":"ContainerStarted","Data":"878c387f2fad00a8e221f9a51e0c2f9bbfe58fae33af30a7c6301ab0bdc9faa8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.927724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" event={"ID":"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f","Type":"ContainerStarted","Data":"9eae57cbd284fb0c9545c78631c6aac7e124516e8e1b03792e6167aab4b8f169"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.929773 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.930540 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nhcjc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.930681 4722 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" podUID="ad1102e8-2b9d-47ea-8c17-4a304c7ee62f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.937936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" event={"ID":"248e6517-2010-41dc-9873-54109bf86b23","Type":"ContainerStarted","Data":"65169ef17fe8ed3b8d6e2cc7f74dff56a0590c382be4ef3de0bade5ef989cd40"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.937983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" event={"ID":"248e6517-2010-41dc-9873-54109bf86b23","Type":"ContainerStarted","Data":"0cf0a940a91d24701bef241993fdeb78d071a983855029d411cc82f24978dd17"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.955899 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" podStartSLOduration=149.955880174 podStartE2EDuration="2m29.955880174s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.943657755 +0000 UTC m=+199.480625689" watchObservedRunningTime="2026-02-26 19:57:36.955880174 +0000 UTC m=+199.492848098" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.957038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 
19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.957492 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.457465388 +0000 UTC m=+199.994433312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.965655 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" podStartSLOduration=149.965637321 podStartE2EDuration="2m29.965637321s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.963446809 +0000 UTC m=+199.500414733" watchObservedRunningTime="2026-02-26 19:57:36.965637321 +0000 UTC m=+199.502605245" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.970859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerStarted","Data":"44c67a1b302ca2aaebaf3ed1afb17be2212ac074e36b5ddf17ce348f461e0ec4"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.989879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" 
event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"3e2cbf22ba7a3f10e066aa3a34edb5b70188b2047d16782835123a91389c4de4"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.989918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"2851f1ab71d687edfaff8cf263ac5627ec69d38db31613f5e5df664f7ebc13a8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.993744 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" podStartSLOduration=149.993724099 podStartE2EDuration="2m29.993724099s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.981527633 +0000 UTC m=+199.518495557" watchObservedRunningTime="2026-02-26 19:57:36.993724099 +0000 UTC m=+199.530692023" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.007488 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" podStartSLOduration=150.00747347 podStartE2EDuration="2m30.00747347s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.006724349 +0000 UTC m=+199.543692283" watchObservedRunningTime="2026-02-26 19:57:37.00747347 +0000 UTC m=+199.544441394" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.033808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" 
event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.058761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.060000 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.559983674 +0000 UTC m=+200.096951598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.065577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"1bb5124ff4ea64d0a00b2c0cc88522a10796d61774317f26aa8d09c11fba0d48"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.103370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" 
event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"984bc09cb352fe66e3e874b9de45a81c2172bfb586f82b440d13b1c47395b65a"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.117308 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"e248c9139ccc9bad842724993bcb11ae76afa5680a17669955c62b0e7b3d798a"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.121976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"735e68392467c7f33496a2ef8663b2b20af01b2c4d3c5056df991d135d19e83b"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.132894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"8ac3d30ab46fb2843abb841042fb4b8b39de30e97512a8c3b73c1aa679e3db77"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.138718 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" podStartSLOduration=150.138702382 podStartE2EDuration="2m30.138702382s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.051560734 +0000 UTC m=+199.588528658" watchObservedRunningTime="2026-02-26 19:57:37.138702382 +0000 UTC m=+199.675670306" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.139970 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" podStartSLOduration=150.139965169 
podStartE2EDuration="2m30.139965169s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.137953751 +0000 UTC m=+199.674921675" watchObservedRunningTime="2026-02-26 19:57:37.139965169 +0000 UTC m=+199.676933093" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.150233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerStarted","Data":"742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.150290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.163316 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8dztn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.163414 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.164322 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" podStartSLOduration=150.16430468 podStartE2EDuration="2m30.16430468s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.163196129 +0000 UTC m=+199.700164053" watchObservedRunningTime="2026-02-26 19:57:37.16430468 +0000 UTC m=+199.701272594" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.166054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.167481 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.66746365 +0000 UTC m=+200.204431574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.171505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"789ef5cf49001f08f051f715a4dee8c0c62b46660c4d775bb2cbd47c5e814a1d"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.173970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"038c1f9be81892cbf4fe1e32ec60edc125a2d5ed5ea7a4098558590c2040bb53"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.179241 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.179281 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.190340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.195466 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.195579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.203895 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" podStartSLOduration=150.203882546 podStartE2EDuration="2m30.203882546s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.201627462 +0000 UTC m=+199.738595406" watchObservedRunningTime="2026-02-26 19:57:37.203882546 +0000 UTC m=+199.740850470" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.223529 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39200: no serving certificate available for the kubelet" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.239383 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" podStartSLOduration=150.239365415 podStartE2EDuration="2m30.239365415s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.237474061 +0000 UTC m=+199.774442015" watchObservedRunningTime="2026-02-26 19:57:37.239365415 +0000 UTC m=+199.776333339" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.269117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.278422 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.778408195 +0000 UTC m=+200.315376119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.373603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.373974 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.873956803 +0000 UTC m=+200.410924727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.474786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.475106 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.975094599 +0000 UTC m=+200.512062523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.575965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.576123 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.076099271 +0000 UTC m=+200.613067195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.576258 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.576586 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.076579434 +0000 UTC m=+200.613547358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.585495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.602086 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:37 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:37 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:37 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.602160 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.621659 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" podStartSLOduration=150.621643256 podStartE2EDuration="2m30.621643256s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.580124576 +0000 UTC m=+200.117092500" 
watchObservedRunningTime="2026-02-26 19:57:37.621643256 +0000 UTC m=+200.158611180" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.653921 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podStartSLOduration=150.653904414 podStartE2EDuration="2m30.653904414s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.652541176 +0000 UTC m=+200.189509110" watchObservedRunningTime="2026-02-26 19:57:37.653904414 +0000 UTC m=+200.190872338" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.686668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.686973 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.186957694 +0000 UTC m=+200.723925618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.788362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.788857 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.288839581 +0000 UTC m=+200.825807505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.889729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.889881 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.389856974 +0000 UTC m=+200.926824898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.889959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.890403 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.390391109 +0000 UTC m=+200.927359033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.991532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.991670 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.491650729 +0000 UTC m=+201.028618653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.991790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.992087 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.492076811 +0000 UTC m=+201.029044735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.079475 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.092542 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.092816 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.592785555 +0000 UTC m=+201.129753479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.092912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.093258 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.593238348 +0000 UTC m=+201.130206362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.193997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.194225 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.694197249 +0000 UTC m=+201.231165173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.194550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.194880 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.694869038 +0000 UTC m=+201.231836962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.210978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"3a1b811425afafd6fce27171b8b9f56b26f4aa59e8fba86589e56c9ba69bf7b0"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.211028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"778a27035489117e115fa38461e41726b5b4f66bbb1db60276038fc2ade3feb2"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.211199 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.224325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"1dce4566fb4ec3a9ad9b4f1ee56b8ade5f2ba0031b6e26bb7700bf0b13a3b5c3"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.251084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"32ffb49ca6efd00172189e40799bbcda3b835e93b5bcb5cb9e1605dd4c40d338"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.266038 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"d89d62f1702e47c7c627e3db8bbc19562cd82dae44740124f888c46762e5716e"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.266082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"775a400e7b1fdcac5f3ddf13664a31f3889e55d4a82107049d58e71e4fdffb08"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.288280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.288655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.292377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"d7abdc06de9b13bb3c221f9ab1fba83f8e466669895db20f4693f727febb7e3f"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.296417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.297668 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:38.797647671 +0000 UTC m=+201.334615595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.303386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"2ea671f4f99b94afa78da1ae4947cc47176a958123bfa251e53975323ea7caa1"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.316163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bc7lz" event={"ID":"95c9eee6-d445-441c-bd33-67606423203e","Type":"ContainerStarted","Data":"b30cbc19f18dc9c46164e568e0b2394513a460495cb81ee7f8641b269e9bb3c3"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.324662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.324734 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327131 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ffc6x container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327189 4722 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" podUID="1987ed24-91bb-4ba3-afb2-807c5a25de00" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327273 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"feed35cc71c3414fe2ea3ee68c876ed1680c7c4909814ce4cdb3da22e44e7a67"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327318 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"03255316dd69d7f12519f23592c0b1ab781ccadbacbfe2d7f9fd9b994a4a82c6"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.347848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"ff3bb7c51159dd9c436736aa71d5f9966f2c1d0fc233209806bd9b3d2f408ac1"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.382687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" event={"ID":"108ac542-c708-437b-8538-9b20337835ce","Type":"ContainerStarted","Data":"8c1dc42ec9a1c17202589910a3dfa2e62ce71a9660f1c12e88a03bb0efc695a5"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.392848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" 
event={"ID":"d02d2f96-f341-476f-b9ce-c9cd482386f1","Type":"ContainerStarted","Data":"ced68cea0785c9746a2c7827e8ab69bb719477720aa04109690f3cfc94235d91"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.398651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.398967 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.898956401 +0000 UTC m=+201.435924325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.410385 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" podStartSLOduration=151.410368916 podStartE2EDuration="2m31.410368916s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.367300822 +0000 UTC m=+200.904268756" watchObservedRunningTime="2026-02-26 19:57:38.410368916 +0000 UTC 
m=+200.947336840" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.411090 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bc7lz" podStartSLOduration=7.411084936 podStartE2EDuration="7.411084936s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.41051267 +0000 UTC m=+200.947480594" watchObservedRunningTime="2026-02-26 19:57:38.411084936 +0000 UTC m=+200.948052860" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.459575 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4wdxv" podStartSLOduration=7.459558765 podStartE2EDuration="7.459558765s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.458038072 +0000 UTC m=+200.995005996" watchObservedRunningTime="2026-02-26 19:57:38.459558765 +0000 UTC m=+200.996526689" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"dacfa9b1849d70d0ea563978a4f79a52907e172cdda13689ecc34d3e34714ccb"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"117361e55413f92a3c0193570bf74d2e46b6e034ceb93993faac08a3cb5999ee"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462768 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" containerID="cri-o://80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" gracePeriod=30 Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.463958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.466771 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" containerID="cri-o://5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" gracePeriod=30 Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.501624 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.501708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.502393 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.002377433 +0000 UTC m=+201.539345357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.597483 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:38 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:38 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:38 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.597792 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.603906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.604257 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:39.104245429 +0000 UTC m=+201.641213343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.617698 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" podStartSLOduration=151.617682881 podStartE2EDuration="2m31.617682881s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.616763936 +0000 UTC m=+201.153731860" watchObservedRunningTime="2026-02-26 19:57:38.617682881 +0000 UTC m=+201.154650795" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.618672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" podStartSLOduration=151.618664049 podStartE2EDuration="2m31.618664049s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.575858312 +0000 UTC m=+201.112826246" watchObservedRunningTime="2026-02-26 19:57:38.618664049 +0000 UTC m=+201.155631973" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.626912 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45408: no serving certificate available for the kubelet" Feb 26 19:57:38 crc 
kubenswrapper[4722]: I0226 19:57:38.658387 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.711584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.711944 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.211930062 +0000 UTC m=+201.748897986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.781766 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" podStartSLOduration=151.781748038 podStartE2EDuration="2m31.781748038s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.709574645 +0000 UTC m=+201.246542579" watchObservedRunningTime="2026-02-26 
19:57:38.781748038 +0000 UTC m=+201.318715962"
Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.817420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.817739 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.3177266 +0000 UTC m=+201.854694524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.918646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.918966 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.418952929 +0000 UTC m=+201.955920843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.919121 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.977060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.020887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.021200 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.521188226 +0000 UTC m=+202.058156150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.121897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.122149 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.622108986 +0000 UTC m=+202.159076910 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.122642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.123014 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.622997311 +0000 UTC m=+202.159965235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.201317 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2llb2"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.202177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.209582 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.216215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2llb2"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225443 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.226030 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.726016291 +0000 UTC m=+202.262984215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.324620 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.328007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.328363 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.828352562 +0000 UTC m=+202.365320486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.329088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.329380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.357070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378586 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"]
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.378824 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378842 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378951 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.382902 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.382983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.390175 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.400827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.401650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.407367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437034 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437675 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.438003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.439406 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.939377748 +0000 UTC m=+202.476345672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config" (OuterVolumeSpecName: "config") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.440372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.468274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.468988 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl" (OuterVolumeSpecName: "kube-api-access-khjdl") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "kube-api-access-khjdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495209 4722 generic.go:334] "Generic (PLEG): container finished" podID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" exitCode=0
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495284 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerDied","Data":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"}
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerDied","Data":"7edb51afac751cd6bd9eeebb7fe8eca97e5c451376b3ba5cf7db2672829e5803"}
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495334 4722 scope.go:117] "RemoveContainer" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495488 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.507550 4722 generic.go:334] "Generic (PLEG): container finished" podID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerID="80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" exitCode=0
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.508585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerDied","Data":"80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea"}
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540328 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542644 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.542713 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.042697957 +0000 UTC m=+202.579665881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543782 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543799 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543809 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543819 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543828 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.545425 4722 scope.go:117] "RemoveContainer" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.546553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.547546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.548474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.549745 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.561636 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": container with ID starting with 5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451 not found: ID does not exist" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.561679 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"} err="failed to get container status \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": rpc error: code = NotFound desc = could not find container \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": container with ID starting with 5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451 not found: ID does not exist"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.579602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.579654 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.580253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.582576 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mr2wq"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.583643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.585347 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"]
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591287 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:39 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:39 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:39 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591318 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.646365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.649453 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.149436512 +0000 UTC m=+202.686404436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.650744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.651520 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.151506801 +0000 UTC m=+202.688474725 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.707176 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.742979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: 
\"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.751851 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.251833714 +0000 UTC m=+202.788801638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.777371 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.782347 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.789974 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.852975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853454 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.854370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.854665 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.354653218 +0000 UTC m=+202.891621142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.873624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.883809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.928988 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.958418 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:40.458375668 +0000 UTC m=+202.995343592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.958431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.958657 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.959596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.985732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058943 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.059225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.059537 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.559525014 +0000 UTC m=+203.096492938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.059942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.060375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config" (OuterVolumeSpecName: "config") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.072623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n" (OuterVolumeSpecName: "kube-api-access-v726n") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "kube-api-access-v726n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.074892 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.112592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.142521 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161929 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161940 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161950 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161959 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.162018 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.662005218 +0000 UTC m=+203.198973142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.163213 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" path="/var/lib/kubelet/pods/9a435401-5ccb-4811-bfd2-92826aa8fa63/volumes" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.202217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.264047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.264383 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.764370089 +0000 UTC m=+203.301338013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.268266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.325310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.365245 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.365759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.865741712 +0000 UTC m=+203.402709636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.393943 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.466690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.467533 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.967514906 +0000 UTC m=+203.504482830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.474561 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523704 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerStarted","Data":"4fca73ce71aaaf439cad76d8ce18fff9edf06fbb6f44d0268b5238e19b9fffd4"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.525863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.525899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"0dcf0c8eeb875944efbe43c423613d539f2d5a1406933217df05b755f6b605eb"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerDied","Data":"c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533124 4722 scope.go:117] "RemoveContainer" containerID="80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533334 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.535788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"1a1531eb9a0be87ee1d19c1d62800785ba2d3efa8258ca52d04a45890e83a6ee"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.535816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"9c75e7d57ce656f4de81e8391d832c0ca6941d64c9df11c6c02b312d048b8923"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.536945 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerStarted","Data":"997bc5738c520dc1ff587018439c5d2532671cf08cdd28060a9f20d28ab60733"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 
19:57:40.544163 4722 generic.go:334] "Generic (PLEG): container finished" podID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerID="88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.544186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerDied","Data":"88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6"} Feb 26 19:57:40 crc kubenswrapper[4722]: W0226 19:57:40.551424 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded54be4f_7a1d_4cf9_b7cc_9b7265667c02.slice/crio-1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8 WatchSource:0}: Error finding container 1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8: Status 404 returned error can't find the container with id 1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.552724 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.553784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.553823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerStarted","Data":"12f48da69d094f4b7c738d277b25810015d5ccecbc024569a487139c88043f02"} Feb 26 
19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.568503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.568719 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.068695963 +0000 UTC m=+203.605663887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.571123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.572254 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:41.072240105 +0000 UTC m=+203.609208029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.584204 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:40 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:40 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:40 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.584255 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.662177 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.666011 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.672890 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.172856596 +0000 UTC m=+203.709824520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.672925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.674878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.675292 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.175263905 +0000 UTC m=+203.712231829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.744806 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.744990 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745006 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745434 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.748429 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.748599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.752242 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.775841 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.275818143 +0000 UTC m=+203.812786067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc 
kubenswrapper[4722]: I0226 19:57:40.877176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.877383 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.377373362 +0000 UTC m=+203.914341286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.906054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.977678 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc 
kubenswrapper[4722]: E0226 19:57:40.977916 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.477902301 +0000 UTC m=+204.014870215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.078577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.078856 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.578844871 +0000 UTC m=+204.115812785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.102198 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.174870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.176360 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.179469 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.179683 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.179875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:41.679852854 +0000 UTC m=+204.216820828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.180554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.180894 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.680881643 +0000 UTC m=+204.217849567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.192482 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.280590 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45412: no serving certificate available for the kubelet" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281675 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " 
pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281721 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.281808 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.781793272 +0000 UTC m=+204.318761196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.357564 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T19:57:40.394202301Z","Handler":null,"Name":""} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: 
\"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383429 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383465 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.384283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.384354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.386419 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.386467 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.415436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.422839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.484110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.491841 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.495281 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.498232 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.579813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.580927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerStarted","Data":"974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.581008 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587064 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.588796 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590305 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:41 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:41 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:41 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590349 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590608 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9" exitCode=0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.616501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" 
event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"cfea991f1816e0ce3f4efd4530dea9428222fd0aee6f38da5b18cf1d92a9ac39"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.635260 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.658471 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" podStartSLOduration=10.658438093000001 podStartE2EDuration="10.658438093s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:41.655503249 +0000 UTC m=+204.192471173" watchObservedRunningTime="2026-02-26 19:57:41.658438093 +0000 UTC m=+204.195406017" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666172 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" exitCode=0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerStarted","Data":"1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.689463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.690937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" 
event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerStarted","Data":"bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.690968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.714407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.724546 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podStartSLOduration=5.724527913 podStartE2EDuration="5.724527913s" podCreationTimestamp="2026-02-26 19:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:41.722119044 +0000 UTC m=+204.259086968" watchObservedRunningTime="2026-02-26 19:57:41.724527913 +0000 UTC m=+204.261495837" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.751114 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.903434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.059924 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091552 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: E0226 19:57:42.091747 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091855 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.093433 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096564 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096730 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096830 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.097787 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.097942 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.099770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.099839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc 
kubenswrapper[4722]: I0226 19:57:42.099877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100156 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod 
\"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.118983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp" (OuterVolumeSpecName: "kube-api-access-qcbmp") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "kube-api-access-qcbmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.136876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume" (OuterVolumeSpecName: "config-volume") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.144553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.147260 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.174997 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" path="/var/lib/kubelet/pods/597fba49-4fb4-4060-af46-9b6fc47c89fc/volumes" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.175798 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.177580 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.202874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: 
\"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.205828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206040 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206063 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206079 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.207074 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.210111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.234710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.272857 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:42 crc kubenswrapper[4722]: W0226 19:57:42.300311 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b9b627_4b55_435b_b34e_bda24686f969.slice/crio-7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2 WatchSource:0}: Error finding container 7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2: Status 404 returned error can't find the container with id 7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.423126 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.570744 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.571704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.574028 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.586273 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:42 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:42 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:42 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.586334 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.595788 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.699348 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerStarted","Data":"28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.702971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerDied","Data":"90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.703013 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.703067 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707844 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1" exitCode=0 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerStarted","Data":"10b9edd74c60c90742be9dacd2d93a4b35e0536412f2688a800dc04c6aa67ba9"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.710864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.710961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.711009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.719095 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.719082385 podStartE2EDuration="2.719082385s" podCreationTimestamp="2026-02-26 19:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:42.717351206 +0000 UTC m=+205.254319140" watchObservedRunningTime="2026-02-26 19:57:42.719082385 +0000 UTC m=+205.256050309" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerStarted","Data":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"} Feb 26 
19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerStarted","Data":"5b613cb39b5bcd5c7a499190105759fdfd8d946463c6f500054844f082aa192b"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726406 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738497 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" exitCode=0 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerStarted","Data":"7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.764602 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" podStartSLOduration=155.764581969 podStartE2EDuration="2m35.764581969s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:42.760040001 +0000 UTC m=+205.297007935" watchObservedRunningTime="2026-02-26 19:57:42.764581969 +0000 UTC 
m=+205.301549893" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.813311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.813327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 
19:57:42.844883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.896653 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.922562 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.973691 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.976278 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.984172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081572 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081683 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081709 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081884 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116431 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") 
" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.168800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 19:57:43 crc kubenswrapper[4722]: W0226 19:57:43.200323 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2299b352_9475_4e85_9a5b_cb08aea743c2.slice/crio-a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a WatchSource:0}: Error finding container a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a: Status 404 returned error can't find the container with id a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 
19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218805 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.219615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.250694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.311056 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.333422 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.337461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.521495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.521544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.525342 4722 patch_prober.go:28] interesting pod/console-f9d7485db-n77d2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.525396 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n77d2" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.585362 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:43 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:43 crc kubenswrapper[4722]: [+]process-running ok Feb 
26 19:57:43 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.585419 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.782868 4722 generic.go:334] "Generic (PLEG): container finished" podID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerID="28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028" exitCode=0 Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.783275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerDied","Data":"28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028"} Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.804039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerStarted","Data":"2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f"} Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.804111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerStarted","Data":"d6f307dcca0db1da0fd43df6eb9e2c34742a8b4a52adf6cc27e982f2edb93466"} Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.805213 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.808335 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f" exitCode=0 Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.809166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"} Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.809186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a"} Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.820856 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.837670 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podStartSLOduration=7.837647755 podStartE2EDuration="7.837647755s" podCreationTimestamp="2026-02-26 19:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:43.829474872 +0000 UTC m=+206.366442806" watchObservedRunningTime="2026-02-26 19:57:43.837647755 +0000 UTC m=+206.374615679" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.942604 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.943593 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.945223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.947443 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.965100 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.035755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.037577 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.060348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.160382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.273950 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.581157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.584975 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:44 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:44 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:44 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.585094 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.733225 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.822545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerStarted","Data":"5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67"} Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826801 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b" exitCode=0 Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"} Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"6640be0fb17f1e8ff94ba19db8f3f2a3eb0875cbbe9514eebc261458ec3bff56"} Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.344739 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466410 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466473 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4261ad19-f7ca-47b6-bb12-0f03ece27d3e" (UID: "4261ad19-f7ca-47b6-bb12-0f03ece27d3e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.467016 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.471803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4261ad19-f7ca-47b6-bb12-0f03ece27d3e" (UID: "4261ad19-f7ca-47b6-bb12-0f03ece27d3e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.567869 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.584527 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:45 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:45 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:45 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.584579 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.842809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerStarted","Data":"9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c"} Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerDied","Data":"974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed"} Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852151 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed" Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852114 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.433216 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45422: no serving certificate available for the kubelet" Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.589331 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:46 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:46 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:46 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.589383 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:46 crc 
kubenswrapper[4722]: I0226 19:57:46.873118 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerID="9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c" exitCode=0 Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.873185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerDied","Data":"9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c"} Feb 26 19:57:47 crc kubenswrapper[4722]: I0226 19:57:47.586323 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:47 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:47 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:47 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:47 crc kubenswrapper[4722]: I0226 19:57:47.586390 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.168265 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45432: no serving certificate available for the kubelet" Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.585204 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.587549 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:49 crc kubenswrapper[4722]: I0226 
19:57:49.342344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.085017 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.487839 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.487897 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.523005 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.530917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.933791 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.142100 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.142594 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" containerID="cri-o://bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" gracePeriod=30 Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.166374 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.166671 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" containerID="cri-o://2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" gracePeriod=30 Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.700252 4722 ???:1] "http: TLS handshake error from 192.168.126.11:48974: no serving certificate available for the kubelet" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212470 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.222884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.223223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.223661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.238947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.295517 4722 generic.go:334] "Generic (PLEG): container finished" podID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerID="bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" exitCode=0 Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.295573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerDied","Data":"bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de"} Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.367898 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.383247 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.392029 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:59 crc kubenswrapper[4722]: I0226 19:57:59.744174 4722 patch_prober.go:28] interesting pod/controller-manager-6bb84c5c65-rffgz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Feb 26 19:57:59 crc kubenswrapper[4722]: I0226 19:57:59.744673 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126561 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 19:58:00 crc kubenswrapper[4722]: E0226 19:58:00.126756 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126862 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.127261 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.129191 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.166538 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.249807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.351474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.368757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.446522 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.729523 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.879643 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"8d33dabf-78a5-4411-80dc-b8793bb36d08\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880247 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"8d33dabf-78a5-4411-80dc-b8793bb36d08\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880416 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d33dabf-78a5-4411-80dc-b8793bb36d08" (UID: "8d33dabf-78a5-4411-80dc-b8793bb36d08"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880708 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.882780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d33dabf-78a5-4411-80dc-b8793bb36d08" (UID: "8d33dabf-78a5-4411-80dc-b8793bb36d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.982222 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.317039 4722 generic.go:334] "Generic (PLEG): container finished" podID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerID="2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" exitCode=0 Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.317119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerDied","Data":"2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f"} Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.319803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerDied","Data":"5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67"} Feb 26 19:58:01 crc kubenswrapper[4722]: 
I0226 19:58:01.319830 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67" Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.319853 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.641420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.238723 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.245100 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265530 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265727 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265738 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner" Feb 26 19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265753 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" Feb 26 
19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265768 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265773 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265870 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265882 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265889 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.266244 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.288005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311407 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: 
\"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.312906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca" (OuterVolumeSpecName: "client-ca") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.312955 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca" (OuterVolumeSpecName: "client-ca") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313008 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313087 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config" (OuterVolumeSpecName: "config") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config" (OuterVolumeSpecName: "config") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.317644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.318871 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.320717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq" (OuterVolumeSpecName: "kube-api-access-q7xpq") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "kube-api-access-q7xpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.333699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7" (OuterVolumeSpecName: "kube-api-access-c5sj7") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "kube-api-access-c5sj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.339181 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.339650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerDied","Data":"997bc5738c520dc1ff587018439c5d2532671cf08cdd28060a9f20d28ab60733"} Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.340235 4722 scope.go:117] "RemoveContainer" containerID="bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.343114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerDied","Data":"d6f307dcca0db1da0fd43df6eb9e2c34742a8b4a52adf6cc27e982f2edb93466"} Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.343208 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.373072 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.378002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.380513 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.382733 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: 
\"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412916 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412944 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412954 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412963 4722 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412983 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412992 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.413000 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.413009 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.424533 4722 patch_prober.go:28] interesting pod/route-controller-manager-75f5777875-nvmrn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" start-of-body= Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.424601 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 
19:58:03.513857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.514584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.515614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.515662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.519287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 
19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.531464 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.600526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:04 crc kubenswrapper[4722]: I0226 19:58:04.152712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" path="/var/lib/kubelet/pods/67e3594a-f3a8-45ae-a45c-a0dc59434864/volumes" Feb 26 19:58:04 crc kubenswrapper[4722]: I0226 19:58:04.153493 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" path="/var/lib/kubelet/pods/a25f20fe-a151-472b-8bef-cf469ec73b38/volumes" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.381490 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.382002 4722 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 19:58:04 crc kubenswrapper[4722]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 26 19:58:04 crc kubenswrapper[4722]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4dtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535596-sfmpl_openshift-infra(7c96e488-8450-4dff-ac4c-5ac9e210a9a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 26 19:58:04 crc kubenswrapper[4722]: > logger="UnhandledError" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.383192 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" Feb 26 19:58:05 crc kubenswrapper[4722]: E0226 19:58:05.355942 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" Feb 26 19:58:07 crc kubenswrapper[4722]: I0226 19:58:07.825639 4722 scope.go:117] "RemoveContainer" containerID="2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" Feb 26 19:58:07 crc kubenswrapper[4722]: W0226 
19:58:07.830894 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0 WatchSource:0}: Error finding container 83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0: Status 404 returned error can't find the container with id 83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0 Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.117929 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.119632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122274 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122665 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122816 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122973 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.123124 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.131799 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.187478 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.280962 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.370588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0"} Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382411 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.384727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.384952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.388428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.398648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.445833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:09 crc kubenswrapper[4722]: W0226 19:58:09.497050 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933 WatchSource:0}: Error finding container 56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933: Status 404 returned error can't find the container with id 56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933 Feb 26 19:58:10 crc kubenswrapper[4722]: I0226 19:58:10.379016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"db5fed8bd87fcf8e768f16c8de557233117c56d7f2d9e510742d4b3e1615ac1c"} Feb 26 19:58:10 crc kubenswrapper[4722]: I0226 19:58:10.379905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933"} Feb 26 19:58:10 crc kubenswrapper[4722]: W0226 19:58:10.676001 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod452039e5_ebab_456a_8ca8_045fa1b1c90a.slice/crio-b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9 WatchSource:0}: Error finding container b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9: Status 404 returned error can't find the 
container with id b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9 Feb 26 19:58:11 crc kubenswrapper[4722]: I0226 19:58:11.386628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerStarted","Data":"b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9"} Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.324760 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.324922 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwmck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9vmx6_openshift-marketplace(ed54be4f-7a1d-4cf9-b7cc-9b7265667c02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.326204 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" Feb 26 19:58:12 crc 
kubenswrapper[4722]: E0226 19:58:12.713989 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.772677 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.772827 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7t4ng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jxbwt_openshift-marketplace(db7129a7-c8b2-44c5-8133-cb1d47bbdd4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.773982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" Feb 26 19:58:14 crc 
kubenswrapper[4722]: I0226 19:58:14.068381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.075630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.200292 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.220669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.125005 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.231389 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.638480 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 
19:58:16.805430 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.805804 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54vwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-jpsrd_openshift-marketplace(94176c67-3742-4347-83c8-d467d4eb6be7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.806965 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.856611 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.861786 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.861913 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tz7z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fn7tr_openshift-marketplace(2299b352-9475-4e85-9a5b-cb08aea743c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.863304 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" Feb 26 19:58:16 crc 
kubenswrapper[4722]: W0226 19:58:16.879264 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08786ca5_a181_435b_88e6_1c6369f88eb0.slice/crio-d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a WatchSource:0}: Error finding container d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a: Status 404 returned error can't find the container with id d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.143461 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.144339 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.148228 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.148475 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.154864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.311554 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.312026 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.313215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.331656 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.421983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2ab95418730cb47c1db78518a07867c0e49bd9b78c041cd4f3bfe794736f892d"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerStarted","Data":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerStarted","Data":"d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423673 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" containerID="cri-o://278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" gracePeriod=30 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423939 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.429256 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.429464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" 
event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.439875 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.439953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447903 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.452795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.456810 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerStarted","Data":"fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.458935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"b0dd8c9fc66eb2e279f3d792e54db5f1a72add39fda738cccb9fd665cdf8ca24"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.463358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.466994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1b41c66b0b68435c039bb85479314f442f50f2d41f17005b3bdf34a81be9ad71"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.467065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.469118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f33e6b4788afeaee9b56d2674e812adfb2895a6a9cdd26f4d93d3e289bd5f1a"} Feb 26 19:58:17 crc kubenswrapper[4722]: E0226 19:58:17.470192 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" Feb 26 19:58:17 crc kubenswrapper[4722]: E0226 19:58:17.471937 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.483368 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" podStartSLOduration=21.483352601 podStartE2EDuration="21.483352601s" podCreationTimestamp="2026-02-26 19:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:17.480939722 +0000 UTC m=+240.017907656" watchObservedRunningTime="2026-02-26 19:58:17.483352601 +0000 UTC m=+240.020320515" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.501958 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.639496 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" podStartSLOduration=11.491153845 podStartE2EDuration="17.63948037s" podCreationTimestamp="2026-02-26 19:58:00 +0000 UTC" firstStartedPulling="2026-02-26 19:58:10.677479457 +0000 UTC m=+233.214447381" lastFinishedPulling="2026-02-26 19:58:16.825805972 +0000 UTC m=+239.362773906" observedRunningTime="2026-02-26 19:58:17.637783502 +0000 UTC m=+240.174751436" watchObservedRunningTime="2026-02-26 19:58:17.63948037 +0000 UTC m=+240.176448294" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.816351 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.838295 4722 csr.go:261] certificate signing request csr-pqh4p is approved, waiting to be issued Feb 26 19:58:17 crc kubenswrapper[4722]: W0226 19:58:17.839849 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod868f4103_f3d2_40ca_871b_ba292ec15557.slice/crio-19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6 WatchSource:0}: Error finding container 19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6: Status 404 returned error can't find the container with id 19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.846798 4722 csr.go:257] certificate signing request csr-pqh4p is issued Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.438974 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.469472 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:18 crc kubenswrapper[4722]: E0226 19:58:18.469782 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.469807 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.470009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.470572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerStarted","Data":"6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerStarted","Data":"7b0661a6e023f6c0b9e8710590cb5e1e6b6021fc127b00d3f79945eb4706862e"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489578 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" containerID="cri-o://6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4" gracePeriod=30 Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.490051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.502286 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76" exitCode=0 Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.503585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.503640 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507816 4722 generic.go:334] "Generic (PLEG): container finished" podID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" exitCode=0 Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507883 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerDied","Data":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.508288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerDied","Data":"d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.508368 4722 scope.go:117] "RemoveContainer" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.509741 4722 generic.go:334] "Generic (PLEG): container finished" podID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerID="fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349" exitCode=0 Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.509800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" 
event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerDied","Data":"fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.512361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerStarted","Data":"b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.512388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerStarted","Data":"19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.517145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"c6519d7b2ab6e3996d2fe937d775751f22fc51b9b7dab7931d4e890dfdd9528a"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.517179 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"313b99c0b25fb4ab1db89d8fe545f9ea8d35d90b65858608dd3ec077fdc6bb60"} Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.523928 4722 patch_prober.go:28] interesting pod/route-controller-manager-5b7ff9db7b-mbfvv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:40042->10.217.0.60:8443: read: connection reset by peer" start-of-body= Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.523994 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:40042->10.217.0.60:8443: read: connection reset by peer" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod 
\"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca" (OuterVolumeSpecName: "client-ca") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config" (OuterVolumeSpecName: "config") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.535885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf" (OuterVolumeSpecName: "kube-api-access-6mqqf") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "kube-api-access-6mqqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.536647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.542790 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podStartSLOduration=22.542774898 podStartE2EDuration="22.542774898s" podCreationTimestamp="2026-02-26 19:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.522647206 +0000 UTC m=+241.059615130" watchObservedRunningTime="2026-02-26 19:58:18.542774898 +0000 UTC m=+241.079742822" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.542915 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vmrpg" podStartSLOduration=191.542910692 podStartE2EDuration="3m11.542910692s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.534316298 +0000 UTC m=+241.071284232" watchObservedRunningTime="2026-02-26 19:58:18.542910692 +0000 UTC m=+241.079878616" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.583584 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.583565238 podStartE2EDuration="1.583565238s" podCreationTimestamp="2026-02-26 19:58:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.578414062 +0000 UTC m=+241.115382006" watchObservedRunningTime="2026-02-26 19:58:18.583565238 +0000 UTC m=+241.120533162" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.627940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.627991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " 
pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628554 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628571 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628583 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628592 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628601 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.730803 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.732177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.733388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.734709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.743829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 
19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.804107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.832307 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.834827 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.848202 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 12:22:10.040098157 +0000 UTC Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.848232 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7360h23m51.191868922s for next certificate rotation Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.068612 4722 scope.go:117] "RemoveContainer" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" Feb 26 19:58:19 crc kubenswrapper[4722]: E0226 19:58:19.069003 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": container with ID starting with 278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991 not found: ID does not exist" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.069037 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"} err="failed to get container status \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": rpc 
error: code = NotFound desc = could not find container \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": container with ID starting with 278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991 not found: ID does not exist" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.479361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:19 crc kubenswrapper[4722]: W0226 19:58:19.483796 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6289e971_979b_46e4_b06d_82c9e9a03a07.slice/crio-c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677 WatchSource:0}: Error finding container c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677: Status 404 returned error can't find the container with id c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677 Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.522772 4722 generic.go:334] "Generic (PLEG): container finished" podID="868f4103-f3d2-40ca-871b-ba292ec15557" containerID="b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13" exitCode=0 Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.522840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerDied","Data":"b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13"} Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525589 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525645 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" 
containerID="6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4" exitCode=255 Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerDied","Data":"6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4"} Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.526966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerStarted","Data":"c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677"} Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.795008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.799632 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.799706 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.848458 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 08:32:59.752677315 +0000 UTC Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.848496 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6468h34m39.904183895s for next certificate rotation Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"452039e5-ebab-456a-8ca8-045fa1b1c90a\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " Feb 26 19:58:19 crc 
kubenswrapper[4722]: I0226 19:58:19.944711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.945461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.945469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config" (OuterVolumeSpecName: "config") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll" (OuterVolumeSpecName: "kube-api-access-dx4ll") pod "452039e5-ebab-456a-8ca8-045fa1b1c90a" (UID: "452039e5-ebab-456a-8ca8-045fa1b1c90a"). 
InnerVolumeSpecName "kube-api-access-dx4ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8" (OuterVolumeSpecName: "kube-api-access-jvgc8") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "kube-api-access-jvgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045665 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045706 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045717 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045726 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045735 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.153690 4722 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" path="/var/lib/kubelet/pods/08786ca5-a181-435b-88e6-1c6369f88eb0/volumes" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.534601 4722 generic.go:334] "Generic (PLEG): container finished" podID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerID="038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef" exitCode=0 Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.534647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerDied","Data":"038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef"} Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.537286 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerStarted","Data":"918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246"} Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.537647 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.538942 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539074 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerDied","Data":"7b0661a6e023f6c0b9e8710590cb5e1e6b6021fc127b00d3f79945eb4706862e"} Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539164 4722 scope.go:117] "RemoveContainer" containerID="6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerDied","Data":"b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9"} Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541577 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541589 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.546347 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.566627 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" podStartSLOduration=4.566609491 podStartE2EDuration="4.566609491s" podCreationTimestamp="2026-02-26 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:20.562527405 +0000 UTC m=+243.099495329" watchObservedRunningTime="2026-02-26 19:58:20.566609491 +0000 UTC m=+243.103577415" Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.586238 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.590721 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.079376 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.124364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126578 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126973 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc" Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126980 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126987 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127336 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127355 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127364 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" 
containerName="route-controller-manager" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.132914 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.132997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133233 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133271 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133970 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.134228 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.138463 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"868f4103-f3d2-40ca-871b-ba292ec15557\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " Feb 26 19:58:21 crc 
kubenswrapper[4722]: I0226 19:58:21.159569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"868f4103-f3d2-40ca-871b-ba292ec15557\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "868f4103-f3d2-40ca-871b-ba292ec15557" (UID: "868f4103-f3d2-40ca-871b-ba292ec15557"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159984 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.164604 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "868f4103-f3d2-40ca-871b-ba292ec15557" (UID: "868f4103-f3d2-40ca-871b-ba292ec15557"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261401 4722 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.363056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.364302 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.364575 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.367117 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.381475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.510009 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.554060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerStarted","Data":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.559067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerStarted","Data":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.560915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562587 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerDied","Data":"19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562721 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.572266 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ftb6" podStartSLOduration=2.242983814 podStartE2EDuration="40.572250159s" podCreationTimestamp="2026-02-26 19:57:41 +0000 UTC" firstStartedPulling="2026-02-26 19:57:42.743027487 +0000 UTC m=+205.279995421" lastFinishedPulling="2026-02-26 19:58:21.072293842 +0000 UTC m=+243.609261766" observedRunningTime="2026-02-26 19:58:21.568238745 +0000 UTC m=+244.105206679" watchObservedRunningTime="2026-02-26 19:58:21.572250159 +0000 UTC m=+244.109218083" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.589879 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mr2wq" podStartSLOduration=3.233761856 podStartE2EDuration="42.589862571s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:41.68960319 +0000 UTC m=+204.226571104" lastFinishedPulling="2026-02-26 19:58:21.045703895 +0000 UTC m=+243.582671819" observedRunningTime="2026-02-26 19:58:21.587551435 +0000 UTC m=+244.124519359" watchObservedRunningTime="2026-02-26 19:58:21.589862571 +0000 UTC m=+244.126830495" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.604953 4722 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-2llb2" podStartSLOduration=1.962285616 podStartE2EDuration="42.604937779s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:40.525819404 +0000 UTC m=+203.062787328" lastFinishedPulling="2026-02-26 19:58:21.168471567 +0000 UTC m=+243.705439491" observedRunningTime="2026-02-26 19:58:21.603998023 +0000 UTC m=+244.140965947" watchObservedRunningTime="2026-02-26 19:58:21.604937779 +0000 UTC m=+244.141905703" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.886539 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.904726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.904772 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.917356 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:21 crc kubenswrapper[4722]: W0226 19:58:21.929942 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b4c107_9bf3_4fa0_8c0f_b1bac20d4ac8.slice/crio-37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd WatchSource:0}: Error finding container 37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd: Status 404 returned error can't find the container with id 37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.973328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.978581 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd" (OuterVolumeSpecName: "kube-api-access-b4dtd") pod "7c96e488-8450-4dff-ac4c-5ac9e210a9a6" (UID: "7c96e488-8450-4dff-ac4c-5ac9e210a9a6"). InnerVolumeSpecName "kube-api-access-b4dtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.074946 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.152333 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" path="/var/lib/kubelet/pods/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/volumes" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.568058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerStarted","Data":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570474 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerStarted","Data":"37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.572180 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.571992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerDied","Data":"2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.572311 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.576947 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.587039 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4qbc" podStartSLOduration=3.998521815 podStartE2EDuration="40.587018297s" podCreationTimestamp="2026-02-26 19:57:42 +0000 UTC" firstStartedPulling="2026-02-26 19:57:44.835166802 +0000 UTC m=+207.372134726" lastFinishedPulling="2026-02-26 19:58:21.423663284 +0000 UTC m=+243.960631208" observedRunningTime="2026-02-26 19:58:22.585163264 +0000 UTC m=+245.122131198" 
watchObservedRunningTime="2026-02-26 19:58:22.587018297 +0000 UTC m=+245.123986221" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.610773 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" podStartSLOduration=6.610748562 podStartE2EDuration="6.610748562s" podCreationTimestamp="2026-02-26 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:22.609637061 +0000 UTC m=+245.146604985" watchObservedRunningTime="2026-02-26 19:58:22.610748562 +0000 UTC m=+245.147716486" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.046740 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7ftb6" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" probeResult="failure" output=< Feb 26 19:58:23 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 19:58:23 crc kubenswrapper[4722]: > Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.311946 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.312564 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.487852 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.488208 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.350661 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p4qbc" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" probeResult="failure" output=< Feb 26 19:58:24 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 19:58:24 crc kubenswrapper[4722]: > Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.537709 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:24 crc kubenswrapper[4722]: E0226 19:58:24.538146 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538163 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538262 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538588 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.543179 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.543569 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.557590 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.230008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.456731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.865369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:26 crc kubenswrapper[4722]: I0226 19:58:26.593002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerStarted","Data":"989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62"} Feb 26 19:58:26 crc kubenswrapper[4722]: I0226 19:58:26.593300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerStarted","Data":"4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1"} Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.163817 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.16379749 podStartE2EDuration="3.16379749s" podCreationTimestamp="2026-02-26 19:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:26.606528082 +0000 UTC m=+249.143496006" watchObservedRunningTime="2026-02-26 19:58:27.16379749 +0000 UTC m=+249.700765414" Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.599458 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" exitCode=0 Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.599561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71"} Feb 26 19:58:28 crc kubenswrapper[4722]: I0226 19:58:28.608171 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65" exitCode=0 Feb 26 19:58:28 crc kubenswrapper[4722]: I0226 19:58:28.608187 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"} Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.543875 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.544230 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.594254 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.621071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerStarted","Data":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.624554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerStarted","Data":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} Feb 
26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.638974 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxbwt" podStartSLOduration=2.3618946259999998 podStartE2EDuration="48.638959388s" podCreationTimestamp="2026-02-26 19:57:41 +0000 UTC" firstStartedPulling="2026-02-26 19:57:42.712275412 +0000 UTC m=+205.249243336" lastFinishedPulling="2026-02-26 19:58:28.989340134 +0000 UTC m=+251.526308098" observedRunningTime="2026-02-26 19:58:29.63621834 +0000 UTC m=+252.173186274" watchObservedRunningTime="2026-02-26 19:58:29.638959388 +0000 UTC m=+252.175927312" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.659131 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vmx6" podStartSLOduration=3.84335928 podStartE2EDuration="50.659112941s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:41.69383555 +0000 UTC m=+204.230803474" lastFinishedPulling="2026-02-26 19:58:28.509589171 +0000 UTC m=+251.046557135" observedRunningTime="2026-02-26 19:58:29.658154384 +0000 UTC m=+252.195122318" watchObservedRunningTime="2026-02-26 19:58:29.659112941 +0000 UTC m=+252.196080865" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.666244 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.960072 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.960152 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.007127 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.143357 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.143409 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.674645 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.176356 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" probeResult="failure" output=< Feb 26 19:58:31 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 19:58:31 crc kubenswrapper[4722]: > Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.496568 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.496621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.534963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.687016 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.953052 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:31 crc 
kubenswrapper[4722]: I0226 19:58:31.999826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.538304 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.644408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"} Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646038 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" exitCode=0 Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5"} Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646293 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mr2wq" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" containerID="cri-o://9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" gracePeriod=2 Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.077040 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.157421 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.158601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities" (OuterVolumeSpecName: "utilities") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258582 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.263536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b" (OuterVolumeSpecName: "kube-api-access-l9q4b") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "kube-api-access-l9q4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.306839 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.347819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.359613 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.359637 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.383621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652265 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" exitCode=0 Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 
19:58:33.652343 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"} Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652376 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"0dcf0c8eeb875944efbe43c423613d539f2d5a1406933217df05b755f6b605eb"} Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652404 4722 scope.go:117] "RemoveContainer" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.654920 4722 generic.go:334] "Generic (PLEG): container finished" podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333" exitCode=0 Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.654991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"} Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.688627 4722 scope.go:117] "RemoveContainer" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.704240 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.708923 4722 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.718573 4722 scope.go:117] "RemoveContainer" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.740925 4722 scope.go:117] "RemoveContainer" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.741402 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": container with ID starting with 9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63 not found: ID does not exist" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741430 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"} err="failed to get container status \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": rpc error: code = NotFound desc = could not find container \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": container with ID starting with 9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63 not found: ID does not exist" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741451 4722 scope.go:117] "RemoveContainer" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585" Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.741797 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": container with ID starting with 
e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585 not found: ID does not exist" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741849 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"} err="failed to get container status \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": rpc error: code = NotFound desc = could not find container \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": container with ID starting with e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585 not found: ID does not exist" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741885 4722 scope.go:117] "RemoveContainer" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9" Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.742208 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": container with ID starting with da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9 not found: ID does not exist" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9" Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.742241 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} err="failed to get container status \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": rpc error: code = NotFound desc = could not find container \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": container with ID starting with da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9 not found: ID does not 
exist" Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.074242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.153095 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72d6495-480f-419e-8b34-b02106e7e279" path="/var/lib/kubelet/pods/a72d6495-480f-419e-8b34-b02106e7e279/volumes" Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.661363 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerStarted","Data":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"} Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.664347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"} Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.664503 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4qbc" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" containerID="cri-o://8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" gracePeriod=2 Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.702275 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fn7tr" podStartSLOduration=2.500940838 podStartE2EDuration="52.702243546s" podCreationTimestamp="2026-02-26 19:57:42 +0000 UTC" firstStartedPulling="2026-02-26 19:57:43.853973649 +0000 UTC m=+206.390941573" lastFinishedPulling="2026-02-26 19:58:34.055276227 +0000 UTC m=+256.592244281" observedRunningTime="2026-02-26 19:58:34.699369624 +0000 UTC m=+257.236337578" 
watchObservedRunningTime="2026-02-26 19:58:34.702243546 +0000 UTC m=+257.239211530" Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.707196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jpsrd" podStartSLOduration=2.717726899 podStartE2EDuration="55.707139525s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:40.556099695 +0000 UTC m=+203.093067619" lastFinishedPulling="2026-02-26 19:58:33.545512321 +0000 UTC m=+256.082480245" observedRunningTime="2026-02-26 19:58:34.680843718 +0000 UTC m=+257.217811662" watchObservedRunningTime="2026-02-26 19:58:34.707139525 +0000 UTC m=+257.244107489" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.147060 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " Feb 26 19:58:35 crc 
kubenswrapper[4722]: I0226 19:58:35.282404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities" (OuterVolumeSpecName: "utilities") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.298420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l" (OuterVolumeSpecName: "kube-api-access-kfg4l") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "kube-api-access-kfg4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.383376 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.383413 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.392782 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.485287 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672003 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" exitCode=0 Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"} Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672090 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672113 4722 scope.go:117] "RemoveContainer" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"6640be0fb17f1e8ff94ba19db8f3f2a3eb0875cbbe9514eebc261458ec3bff56"} Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.691199 4722 scope.go:117] "RemoveContainer" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.706077 4722 scope.go:117] "RemoveContainer" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.715662 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.724716 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725385 4722 scope.go:117] "RemoveContainer" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 19:58:35.725778 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": container with ID starting with 8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2 not found: ID does not exist" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725807 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"} err="failed to get container status \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": rpc error: code = NotFound desc = could not find container \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": container with ID starting with 8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2 not found: ID does not exist" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725828 4722 scope.go:117] "RemoveContainer" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76" Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 19:58:35.726038 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": container with ID starting with 822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76 not found: ID does not exist" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726081 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"} err="failed to get container status \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": rpc error: code = NotFound desc = could not find container \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": container with ID starting with 822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76 not found: ID does not exist" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726111 4722 scope.go:117] "RemoveContainer" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b" Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 
19:58:35.726405 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": container with ID starting with a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b not found: ID does not exist" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726434 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"} err="failed to get container status \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": rpc error: code = NotFound desc = could not find container \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": container with ID starting with a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b not found: ID does not exist" Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.874971 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.875208 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7ftb6" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" containerID="cri-o://be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" gracePeriod=2 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.163238 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" path="/var/lib/kubelet/pods/b6f1a3bb-e878-47a7-9740-a8a4012eba8d/volumes" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.173692 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 
19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.173942 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" containerID="cri-o://918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" gracePeriod=30 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.182073 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.182328 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" containerID="cri-o://60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" gracePeriod=30 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.373197 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.500619 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.501163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.501250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.502281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities" (OuterVolumeSpecName: "utilities") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.510457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl" (OuterVolumeSpecName: "kube-api-access-257cl") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "kube-api-access-257cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.527832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601828 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601864 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601879 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.607363 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688340 4722 generic.go:334] "Generic (PLEG): container finished" podID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" exitCode=0 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerDied","Data":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"} Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerDied","Data":"37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd"} Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688432 4722 scope.go:117] "RemoveContainer" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688521 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692015 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" exitCode=0 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692044 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"} Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2"} Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692095 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.693771 4722 generic.go:334] "Generic (PLEG): container finished" podID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerID="918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" exitCode=0 Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.693806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerDied","Data":"918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246"} Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701341 4722 scope.go:117] "RemoveContainer" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.701740 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": container with ID starting with 60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b not found: ID does not exist" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701770 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"} err="failed to get container status \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": rpc error: code = NotFound desc = could not find container \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": container with ID starting with 60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701787 4722 scope.go:117] 
"RemoveContainer" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702364 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702472 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.703223 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca" (OuterVolumeSpecName: "client-ca") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.703264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config" (OuterVolumeSpecName: "config") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.705815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.708093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2" (OuterVolumeSpecName: "kube-api-access-bwvc2") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "kube-api-access-bwvc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.720339 4722 scope.go:117] "RemoveContainer" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.720605 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.723342 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.735893 4722 scope.go:117] "RemoveContainer" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.737367 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749041 4722 scope.go:117] "RemoveContainer" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.749474 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": container with ID starting with be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff not found: ID does not exist" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749506 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"} err="failed to get container status \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": rpc error: code = NotFound desc = could not 
find container \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": container with ID starting with be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749524 4722 scope.go:117] "RemoveContainer" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.750370 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": container with ID starting with 8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac not found: ID does not exist" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.750392 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac"} err="failed to get container status \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": rpc error: code = NotFound desc = could not find container \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": container with ID starting with 8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.750405 4722 scope.go:117] "RemoveContainer" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.756633 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": container with ID starting with 1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178 not found: ID 
does not exist" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.756683 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178"} err="failed to get container status \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": rpc error: code = NotFound desc = could not find container \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": container with ID starting with 1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178 not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803704 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803761 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803773 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803784 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: 
\"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905992 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca" (OuterVolumeSpecName: "client-ca") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config" (OuterVolumeSpecName: "config") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.910287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc" (OuterVolumeSpecName: "kube-api-access-6pzjc") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "kube-api-access-6pzjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.912248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007158 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007184 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007196 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007212 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007220 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.010417 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.016841 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" 
event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerDied","Data":"c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677"} Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701314 4722 scope.go:117] "RemoveContainer" containerID="918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.725949 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.729762 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130313 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130735 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130758 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130775 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130793 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130859 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130876 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130899 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130914 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130939 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130955 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130978 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130993 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 
19:58:38.131013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131028 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131052 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131071 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131095 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131110 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131179 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131201 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131428 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131465 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.131484 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131509 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131536 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.132257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.132511 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.133359 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139165 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139193 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139243 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139178 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139312 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139385 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139605 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139845 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139963 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.143168 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.143555 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.143856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.147789 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.156949 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" path="/var/lib/kubelet/pods/6289e971-979b-46e4-b06d-82c9e9a03a07/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.157726 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" path="/var/lib/kubelet/pods/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.158234 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" path="/var/lib/kubelet/pods/a3b9b627-4b55-435b-b34e-bda24686f969/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.159661 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.159693 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod 
\"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.220553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321502 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.321681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321779 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.323617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: 
\"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.323765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.329596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.330637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.340735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.345462 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9f9\" (UniqueName: 
\"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.345476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.351661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.364998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.456628 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.466501 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.719792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: W0226 19:58:38.724384 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa4bde9_9700_412f_a78d_73c2eb6fbc68.slice/crio-4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c WatchSource:0}: Error finding container 4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c: Status 404 returned error can't find the container with id 4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.867337 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: W0226 19:58:38.880309 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3dcc73_c386_4b09_a111_e705939eabbd.slice/crio-5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578 WatchSource:0}: Error finding container 5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578: Status 404 returned error can't find the container with id 5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578 Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.708796 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.709177 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 
19:58:39.715658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerStarted","Data":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.715722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerStarted","Data":"4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.716870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerStarted","Data":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.716919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerStarted","Data":"5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.744235 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.177013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.220810 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.720928 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.725359 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.738230 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" podStartSLOduration=4.738211093 podStartE2EDuration="4.738211093s" podCreationTimestamp="2026-02-26 19:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:40.7359606 +0000 UTC m=+263.272928544" watchObservedRunningTime="2026-02-26 19:58:40.738211093 +0000 UTC m=+263.275179027" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.757408 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" podStartSLOduration=4.757391078 podStartE2EDuration="4.757391078s" podCreationTimestamp="2026-02-26 19:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:40.756367089 +0000 UTC m=+263.293335023" watchObservedRunningTime="2026-02-26 19:58:40.757391078 +0000 UTC m=+263.294359002" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.766192 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.273487 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.533375 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.726773 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" containerID="cri-o://7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" gracePeriod=2 Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.082062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267422 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.270541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities" (OuterVolumeSpecName: "utilities") pod 
"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.273413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck" (OuterVolumeSpecName: "kube-api-access-jwmck") pod "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "kube-api-access-jwmck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.335285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370380 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370434 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370456 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747545 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" exitCode=0 Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8"} Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747699 4722 scope.go:117] "RemoveContainer" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 
19:58:42.747619 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.765798 4722 scope.go:117] "RemoveContainer" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.786059 4722 scope.go:117] "RemoveContainer" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.799870 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.806813 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810178 4722 scope.go:117] "RemoveContainer" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.810664 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": container with ID starting with 7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6 not found: ID does not exist" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810719 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} err="failed to get container status \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": rpc error: code = NotFound desc = could not find container \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": container with ID starting with 
7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810752 4722 scope.go:117] "RemoveContainer" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.811219 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": container with ID starting with ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71 not found: ID does not exist" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.811252 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71"} err="failed to get container status \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": rpc error: code = NotFound desc = could not find container \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": container with ID starting with ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.811273 4722 scope.go:117] "RemoveContainer" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.811672 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": container with ID starting with 92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638 not found: ID does not exist" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc 
kubenswrapper[4722]: I0226 19:58:42.811703 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638"} err="failed to get container status \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": rpc error: code = NotFound desc = could not find container \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": container with ID starting with 92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.896972 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.897011 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.945636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:43 crc kubenswrapper[4722]: I0226 19:58:43.811005 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:44 crc kubenswrapper[4722]: I0226 19:58:44.155694 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" path="/var/lib/kubelet/pods/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02/volumes" Feb 26 19:58:47 crc kubenswrapper[4722]: I0226 19:58:47.392294 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:58:48 crc kubenswrapper[4722]: I0226 19:58:48.457395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 
19:58:48 crc kubenswrapper[4722]: I0226 19:58:48.467302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487238 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487539 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487596 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.488109 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.488179 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e" gracePeriod=600 Feb 
26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.808966 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e" exitCode=0 Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.809079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.809442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.158790 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.160953 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager" containerID="cri-o://59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" gracePeriod=30 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.237311 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.237524 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" 
containerName="route-controller-manager" containerID="cri-o://fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" gracePeriod=30 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.709863 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.772703 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831783 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerDied","Data":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831894 4722 scope.go:117] "RemoveContainer" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831768 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" exitCode=0 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.832077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerDied","Data":"4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 
19:58:56.835533 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" exitCode=0 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835574 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerDied","Data":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerDied","Data":"5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853413 4722 scope.go:117] "RemoveContainer" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: E0226 19:58:56.853858 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": container with ID starting with fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576 not found: ID does not exist" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853918 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} err="failed to get 
container status \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": rpc error: code = NotFound desc = could not find container \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": container with ID starting with fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576 not found: ID does not exist" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853953 4722 scope.go:117] "RemoveContainer" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.860897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.860998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.861035 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.861082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.862240 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca" (OuterVolumeSpecName: "client-ca") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.862326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config" (OuterVolumeSpecName: "config") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.867179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7" (OuterVolumeSpecName: "kube-api-access-sm6z7") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "kube-api-access-sm6z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.868534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.871779 4722 scope.go:117] "RemoveContainer" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" Feb 26 19:58:56 crc kubenswrapper[4722]: E0226 19:58:56.873042 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": container with ID starting with 59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d not found: ID does not exist" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.873085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} err="failed to get container status \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": rpc error: code = NotFound desc = could not find container \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": container with ID starting with 59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d not found: ID does not exist" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: 
\"7b3dcc73-c386-4b09-a111-e705939eabbd\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962153 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962383 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962395 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962410 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962418 4722 reconciler_common.go:293] "Volume detached for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962778 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config" (OuterVolumeSpecName: "config") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.965237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.965555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9" (OuterVolumeSpecName: "kube-api-access-2w9f9") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "kube-api-access-2w9f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063876 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063917 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063928 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063939 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063948 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.164493 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.175286 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.183089 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.187470 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.572928 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" containerID="cri-o://742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" gracePeriod=15 Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.842073 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerID="742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" exitCode=0 Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.842150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerDied","Data":"742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0"} Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.930909 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075003 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: 
\"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075180 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: 
I0226 19:58:58.075219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075259 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075907 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076522 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076546 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076556 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076565 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.080537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.080876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd" (OuterVolumeSpecName: "kube-api-access-nqrpd") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "kube-api-access-nqrpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.081564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.081909 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.082224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.082601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.084230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.085490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.088405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.160395 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" path="/var/lib/kubelet/pods/3fa4bde9-9700-412f-a78d-73c2eb6fbc68/volumes" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.161600 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" path="/var/lib/kubelet/pods/7b3dcc73-c386-4b09-a111-e705939eabbd/volumes" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162244 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"] Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-content" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-content" Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162495 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager" Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162510 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-utilities" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162516 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-utilities" Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 
19:58:58.162527 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162533 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162547 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162556 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162563 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162669 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162685 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162695 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162703 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager" Feb 26 
19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163281 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163454 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163682 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.165388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.165628 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.170923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171125 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171197 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171154 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171412 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171597 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171368 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171372 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.172461 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:58:58 
crc kubenswrapper[4722]: I0226 19:58:58.173718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.174323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.176771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177356 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177377 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177388 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177398 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177408 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177419 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177428 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177439 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177448 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.178634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.179486 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") 
pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" 
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279611 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279788 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " 
pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279882 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279989 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280026 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod 
\"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.381030 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.381060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: 
\"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" 
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") pod 
\"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384558 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384759 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.385031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.385131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.386022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.386681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc 
kubenswrapper[4722]: I0226 19:58:58.386928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.387777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.387771 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.388024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.388113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.391527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.394018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.394427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.396246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.398039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.405619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.495077 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.505682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.520082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.710479 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"] Feb 26 19:58:58 crc kubenswrapper[4722]: W0226 19:58:58.717500 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2219255_4e92_4960_853c_3f92afcb30ae.slice/crio-73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73 WatchSource:0}: Error finding container 73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73: Status 404 returned error can't find the container with id 73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73 Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" event={"ID":"f2219255-4e92-4960-853c-3f92afcb30ae","Type":"ContainerStarted","Data":"acad4bb5ad4e6f7c8ccced243e763e5328a3bc411cfddb9720355db4f1077e73"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852559 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" 
event={"ID":"f2219255-4e92-4960-853c-3f92afcb30ae","Type":"ContainerStarted","Data":"73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerDied","Data":"276f96c20b112e49a7e22df1751b734dd6c8d0b22d1debea8e9f0abd1d77f1fb"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854582 4722 scope.go:117] "RemoveContainer" containerID="742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854597 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.855288 4722 patch_prober.go:28] interesting pod/controller-manager-676764cf6-szbgk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.855364 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" podUID="f2219255-4e92-4960-853c-3f92afcb30ae" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.872719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" podStartSLOduration=2.872699775 podStartE2EDuration="2.872699775s" podCreationTimestamp="2026-02-26 19:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:58.870875165 +0000 UTC m=+281.407843089" watchObservedRunningTime="2026-02-26 19:58:58.872699775 +0000 UTC m=+281.409667709" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.892930 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.897302 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.996389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"] Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.009206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"] Feb 26 19:58:59 crc kubenswrapper[4722]: W0226 19:58:59.009989 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7277c7_02c4_4338_92ca_4408b71c2db6.slice/crio-6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b WatchSource:0}: Error finding container 6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b: Status 404 returned error can't find the container with id 6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.860961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" 
event={"ID":"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4","Type":"ContainerStarted","Data":"f91dc88170c6de886fe7a1322ae979d1a58a67167cbf81b34bf7ebe46d1fe854"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.861350 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.861365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" event={"ID":"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4","Type":"ContainerStarted","Data":"4b8ac1fd056b64ba6a9dcc1da977f11764d782b935d3163d8df60ff5cd010366"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.864452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" event={"ID":"1f7277c7-02c4-4338-92ca-4408b71c2db6","Type":"ContainerStarted","Data":"10b2af52355b76a6db30a45bf83ad4b390c101d71d2418de92c228f1ce553422"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.864506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" event={"ID":"1f7277c7-02c4-4338-92ca-4408b71c2db6","Type":"ContainerStarted","Data":"6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.865706 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.868827 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.870661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.880896 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" podStartSLOduration=3.880874723 podStartE2EDuration="3.880874723s" podCreationTimestamp="2026-02-26 19:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:59.877663033 +0000 UTC m=+282.414630967" watchObservedRunningTime="2026-02-26 19:58:59.880874723 +0000 UTC m=+282.417842667" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.913656 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.942066 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" podStartSLOduration=27.942046439 podStartE2EDuration="27.942046439s" podCreationTimestamp="2026-02-26 19:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:59.939299772 +0000 UTC m=+282.476267716" watchObservedRunningTime="2026-02-26 19:58:59.942046439 +0000 UTC m=+282.479014363" Feb 26 19:59:00 crc kubenswrapper[4722]: I0226 19:59:00.151855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" path="/var/lib/kubelet/pods/fd936901-7dc0-416a-8ac6-8305c72d65ba/volumes" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.806966 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.807778 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808178 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808688 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808795 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808732 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808912 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808827 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.811631 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812052 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812277 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812374 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812449 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812535 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812634 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812719 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812907 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813028 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813194 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813347 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813538 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813668 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813945 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814064 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.814210 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814358 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814652 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815089 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815288 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815461 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815602 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.815937 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.816312 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.816641 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.817124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.847384 4722 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958877 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959544 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060812 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061022 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061065 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061212 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061263 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.148664 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: W0226 19:59:04.175507 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c WatchSource:0}: Error finding container 55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c: Status 404 returned error can't find the container with id 55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c Feb 26 19:59:04 crc kubenswrapper[4722]: E0226 19:59:04.179435 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897e4408675655e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:59:04.178459998 
+0000 UTC m=+286.715427952,LastTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.900978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"} Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.901082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c"} Feb 26 19:59:04 crc kubenswrapper[4722]: E0226 19:59:04.902100 4722 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.905648 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.908484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909489 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" exitCode=0 Feb 26 19:59:04 crc 
kubenswrapper[4722]: I0226 19:59:04.909545 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" exitCode=0
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909568 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" exitCode=0
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909587 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" exitCode=2
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909713 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.914308 4722 generic.go:334] "Generic (PLEG): container finished" podID="d27a2962-12b7-476f-a95f-b4f161165950" containerID="989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62" exitCode=0
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.914396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerDied","Data":"989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62"}
Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.915594 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:05 crc kubenswrapper[4722]: I0226 19:59:05.928755 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.178513 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.179676 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.180607 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.181015 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.277506 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.278121 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.278671 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290593 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290629 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290661 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") "
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290854 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock" (OuterVolumeSpecName: "var-lock") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291088 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291124 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291187 4722 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291212 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291236 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.296072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.391694 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerDied","Data":"4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1"}
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938236 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.941419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944168 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" exitCode=0
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944228 4722 scope.go:117] "RemoveContainer" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944346 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.967896 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.968786 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.970812 4722 scope.go:117] "RemoveContainer" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.982393 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.982761 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.988983 4722 scope.go:117] "RemoveContainer" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.021571 4722 scope.go:117] "RemoveContainer" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.034603 4722 scope.go:117] "RemoveContainer" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.051277 4722 scope.go:117] "RemoveContainer" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.074617 4722 scope.go:117] "RemoveContainer" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.075178 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": container with ID starting with 4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017 not found: ID does not exist" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075241 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"} err="failed to get container status \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": rpc error: code = NotFound desc = could not find container \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": container with ID starting with 4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017 not found: ID does not exist"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075294 4722 scope.go:117] "RemoveContainer" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.075632 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": container with ID starting with ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698 not found: ID does not exist" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075662 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"} err="failed to get container status \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": rpc error: code = NotFound desc = could not find container \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": container with ID starting with ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698 not found: ID does not exist"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075704 4722 scope.go:117] "RemoveContainer" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.076006 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": container with ID starting with af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594 not found: ID does not exist" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076092 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"} err="failed to get container status \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": rpc error: code = NotFound desc = could not find container \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": container with ID starting with af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594 not found: ID does not exist"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076172 4722 scope.go:117] "RemoveContainer" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.076592 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": container with ID starting with 3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336 not found: ID does not exist" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076621 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"} err="failed to get container status \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": rpc error: code = NotFound desc = could not find container \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": container with ID starting with 3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336 not found: ID does not exist"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076642 4722 scope.go:117] "RemoveContainer" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.077598 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": container with ID starting with db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20 not found: ID does not exist" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077624 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"} err="failed to get container status \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": rpc error: code = NotFound desc = could not find container \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": container with ID starting with db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20 not found: ID does not exist"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077638 4722 scope.go:117] "RemoveContainer" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"
Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.077955 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": container with ID starting with 2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04 not found: ID does not exist" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"
Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077985 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"} err="failed to get container status \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": rpc error: code = NotFound desc = could not find container \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": container with ID starting with 2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04 not found: ID does not exist"
Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.151773 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.153260 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.162609 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 26 19:59:09 crc kubenswrapper[4722]: E0226 19:59:09.229788 4722 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" volumeName="registry-storage"
Feb 26 19:59:10 crc kubenswrapper[4722]: E0226 19:59:10.065589 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897e4408675655e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,LastTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.434168 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.434749 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.435121 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.435701 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.436282 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:11 crc kubenswrapper[4722]: I0226 19:59:11.436338 4722 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.436858 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms"
Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.637484 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms"
Feb 26 19:59:12 crc kubenswrapper[4722]: E0226 19:59:12.038904 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms"
Feb 26 19:59:12 crc kubenswrapper[4722]: E0226 19:59:12.840093 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s"
Feb 26 19:59:14 crc kubenswrapper[4722]: E0226 19:59:14.440632 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s"
Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.145587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.146742 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.160791 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.160835 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:15 crc kubenswrapper[4722]: E0226 19:59:15.161364 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.161916 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:15 crc kubenswrapper[4722]: W0226 19:59:15.186606 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c WatchSource:0}: Error finding container 5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c: Status 404 returned error can't find the container with id 5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.005988 4722 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7dbb034fd5550b7ca9aea60fcddd66447943a342ac3248a763f2b5645f32cb67" exitCode=0
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.006123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7dbb034fd5550b7ca9aea60fcddd66447943a342ac3248a763f2b5645f32cb67"}
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.006443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c"}
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007001 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007032 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007701 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused"
Feb 26 19:59:16 crc kubenswrapper[4722]: E0226 19:59:16.007776 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014099 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014655 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e" exitCode=1
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e"}
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.015195 4722 scope.go:117] "RemoveContainer" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e"
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"62cd6353fc78f72c1c1e45ffa17b915497703b70dcaef9fce013a280a54b3720"}
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85db130dfc83e2bc8705836d7791859b38a87a56f14426394f5d0fb99879cdc5"}
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d2253a7268468e07a8b6813c671f3b14a386c8daac573565b60e41be491df228"}
Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f65011115ad3fa7817d9209e063b0bbddb620ac57c4aeb00b731717eb002b51"}
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"18afbfb0a409d59d9f569fa1c130c61296ca67a95177940dbd90576b43266455"}
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028546 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028789 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028813 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.032686 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.033618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.033745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"}
Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.162940 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.164249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.167785 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312004 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312259 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.037877 4722 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.066149 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.066177 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.069367 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.071694 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="567e4c9c-9a59-4e19-a924-88a8d6e13789"
Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.187161 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 19:59:24 crc kubenswrapper[4722]: I0226 19:59:24.070650 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:24 crc kubenswrapper[4722]: I0226 19:59:24.070685 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8"
Feb 26 19:59:28 crc kubenswrapper[4722]: I0226 19:59:28.157689 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="567e4c9c-9a59-4e19-a924-88a8d6e13789"
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.310490 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.310878 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.314709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.576178 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.923590 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.933649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 19:59:33 crc
kubenswrapper[4722]: I0226 19:59:33.285776 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 19:59:34 crc kubenswrapper[4722]: I0226 19:59:34.034846 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 19:59:34 crc kubenswrapper[4722]: I0226 19:59:34.582180 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.470885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.537188 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.895495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.100322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.176677 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.177535 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.287280 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.362689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 
19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.676962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.851337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.041083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.338260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.415469 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.423276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.444205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.456610 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.479868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.549590 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.688094 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.774574 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.810683 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.915083 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.920921 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.920990 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.925155 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.943530 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.943503945 podStartE2EDuration="14.943503945s" podCreationTimestamp="2026-02-26 19:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:59:37.940276535 +0000 UTC m=+320.477244469" watchObservedRunningTime="2026-02-26 19:59:37.943503945 +0000 UTC m=+320.480471929" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.402806 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.482040 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.559634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.615496 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.646102 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.667079 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.759238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.802766 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.876055 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.937303 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.960952 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.050464 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 19:59:39 crc 
kubenswrapper[4722]: I0226 19:59:39.088662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.093028 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.207023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.241879 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.289833 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.308967 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.327511 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.380731 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.392405 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.504599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.535480 4722 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.621252 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.623004 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.659731 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.662961 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.721934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.729786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.753759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.761042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.836896 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.889270 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.032429 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.057877 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.188151 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.244405 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.250620 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.298274 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.358507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.364942 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.378909 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.416070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.426328 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.454062 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.495464 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.565742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.672491 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.672878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.686542 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.745248 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.791388 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.796334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.875498 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.930893 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.087119 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.176622 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.216645 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.266542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.292441 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.378651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.431839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.456115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.544096 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.597216 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 19:59:41 crc 
kubenswrapper[4722]: I0226 19:59:41.634974 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.658179 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.689772 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.714432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.741578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.824761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.841074 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.056817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.105571 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.232837 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310417 4722 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310508 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310597 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.312053 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.312353 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f" gracePeriod=30 Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.314030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.368888 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.370948 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.397220 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.558856 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.614936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.660637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.692722 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.792352 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.827354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.832213 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.899438 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.949662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.972387 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.008823 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.080711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.116505 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.176968 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.203049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.248612 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.252315 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.270268 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.295480 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.410188 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.444111 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.477083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.548918 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.588954 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.619340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.644868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.706232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.830921 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.868696 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.896530 4722 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.924431 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.962805 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.989630 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.991526 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.997528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.000656 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.002558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.060278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.082279 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.144107 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.247466 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.252341 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.319962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.329744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.410716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.417618 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.544855 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.602430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.634915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.641583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 19:59:44 crc 
kubenswrapper[4722]: I0226 19:59:44.660322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.675528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.744540 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.817214 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.866692 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.870945 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.884668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.932499 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.071717 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.074734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.108746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 
26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.239025 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.239607 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.276165 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.292686 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.375580 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.438611 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.439204 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" gracePeriod=5 Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.513515 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.519833 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.609709 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.612736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.751611 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.778420 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.874857 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.903019 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.036444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.039249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.184075 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.217693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.232753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.404170 
4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.428852 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.505389 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.505495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.530210 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.533780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.535956 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.703057 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.729059 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.893120 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.937385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" 
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.008397 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.052155 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.065690 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.080962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.116689 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.143583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.290292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.292727 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.401254 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.418861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.626586 4722 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.652897 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.755364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.803535 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.805268 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.846110 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.911105 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.958583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.031326 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.086408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.234242 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.283102 4722 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.284990 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.370226 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.454334 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.593033 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.828827 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.878627 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.890204 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.984626 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.172583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.251235 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 
19:59:49.380337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.502900 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.864288 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.893963 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.108915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.240989 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.370889 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.402173 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.461938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.600205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.604388 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.659942 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.716266 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.759908 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.791232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.884762 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.018690 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.018757 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201944 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202053 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202083 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202126 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202206 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202435 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202452 4722 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202462 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202579 4722 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.213375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244251 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244300 4722 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" exitCode=137 Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244337 4722 scope.go:117] "RemoveContainer" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244420 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.271575 4722 scope.go:117] "RemoveContainer" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" Feb 26 19:59:51 crc kubenswrapper[4722]: E0226 19:59:51.272502 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": container with ID starting with e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583 not found: ID does not exist" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.272562 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"} err="failed to get container status \"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": rpc error: code = NotFound desc = could not find container 
\"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": container with ID starting with e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583 not found: ID does not exist" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.303405 4722 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.331550 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.381785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.052644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.171865 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.471453 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.608021 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.759078 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.298585 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] 
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.302436 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server" containerID="cri-o://6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" gracePeriod=30 Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.310131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.310450 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2llb2" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server" containerID="cri-o://c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" gracePeriod=30 Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.320868 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.321081 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" containerID="cri-o://12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" gracePeriod=30 Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.332211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.332445 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server" 
containerID="cri-o://22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" gracePeriod=30 Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.334400 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.334616 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server" containerID="cri-o://a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" gracePeriod=30 Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.626035 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vpr4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.626387 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.715355 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.840338 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.843227 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.848521 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877224 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.896718 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.898731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities" (OuterVolumeSpecName: "utilities") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.907510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk" (OuterVolumeSpecName: "kube-api-access-54vwk") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "kube-api-access-54vwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.942876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978766 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID: 
\"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978891 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.979001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980016 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities" (OuterVolumeSpecName: "utilities") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities" (OuterVolumeSpecName: "utilities") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980081 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980097 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980111 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities" (OuterVolumeSpecName: "utilities") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982162 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x" (OuterVolumeSpecName: "kube-api-access-4z29x") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "kube-api-access-4z29x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5" (OuterVolumeSpecName: "kube-api-access-tz7z5") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "kube-api-access-tz7z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng" (OuterVolumeSpecName: "kube-api-access-7t4ng") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "kube-api-access-7t4ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.009409 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.040039 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod 
\"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081572 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081584 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081598 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081611 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081623 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081633 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 
crc kubenswrapper[4722]: I0226 20:00:04.081657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.082262 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.084455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls" (OuterVolumeSpecName: "kube-api-access-xmdls") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "kube-api-access-xmdls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.084979 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.106983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.182972 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183100 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183280 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183353 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.326964 4722 generic.go:334] "Generic (PLEG): container finished" podID="21b11897-db24-4d65-a438-d3695ccee5fc" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327095 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327164 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerDied","Data":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerDied","Data":"9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327245 4722 scope.go:117] "RemoveContainer" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331813 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"12f48da69d094f4b7c738d277b25810015d5ccecbc024569a487139c88043f02"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331976 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335354 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"10b9edd74c60c90742be9dacd2d93a4b35e0536412f2688a800dc04c6aa67ba9"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335523 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.337741 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"4fca73ce71aaaf439cad76d8ce18fff9edf06fbb6f44d0268b5238e19b9fffd4"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339943 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2llb2" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.348566 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349315 4722 scope.go:117] "RemoveContainer" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.349739 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": container with ID starting with 12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb not found: ID does not exist" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349779 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"} err="failed to get container status \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": rpc error: code = NotFound desc = could not find container \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": container with ID starting with 12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349824 4722 scope.go:117] "RemoveContainer" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350162 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350814 4722 generic.go:334] "Generic (PLEG): container finished" podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.353767 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.363448 4722 scope.go:117] "RemoveContainer" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.366049 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.371747 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.378790 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.382052 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.385080 4722 scope.go:117] "RemoveContainer" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.387768 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.391850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.396539 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.399427 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403051 4722 scope.go:117] "RemoveContainer" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.403422 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": container with ID starting with 6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9 not found: ID does not exist" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403457 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"} err="failed to get container status \"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": rpc error: code = NotFound desc = could not find container 
\"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": container with ID starting with 6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403483 4722 scope.go:117] "RemoveContainer" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.403823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": container with ID starting with 8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5 not found: ID does not exist" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403857 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5"} err="failed to get container status \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": rpc error: code = NotFound desc = could not find container \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": container with ID starting with 8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403882 4722 scope.go:117] "RemoveContainer" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.404174 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": container with ID starting with 5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332 not found: ID does not exist" 
containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.404223 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332"} err="failed to get container status \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": rpc error: code = NotFound desc = could not find container \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": container with ID starting with 5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.404258 4722 scope.go:117] "RemoveContainer" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.416387 4722 scope.go:117] "RemoveContainer" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.434437 4722 scope.go:117] "RemoveContainer" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.446334 4722 scope.go:117] "RemoveContainer" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.446943 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": container with ID starting with 22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093 not found: ID does not exist" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.446981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} err="failed to get container status \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": rpc error: code = NotFound desc = could not find container \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": container with ID starting with 22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447009 4722 scope.go:117] "RemoveContainer" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.447397 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": container with ID starting with e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65 not found: ID does not exist" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447440 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"} err="failed to get container status \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": rpc error: code = NotFound desc = could not find container \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": container with ID starting with e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447469 4722 scope.go:117] "RemoveContainer" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.447828 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": container with ID starting with b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1 not found: ID does not exist" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447859 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"} err="failed to get container status \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": rpc error: code = NotFound desc = could not find container \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": container with ID starting with b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447880 4722 scope.go:117] "RemoveContainer" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.458572 4722 scope.go:117] "RemoveContainer" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.472468 4722 scope.go:117] "RemoveContainer" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.506795 4722 scope.go:117] "RemoveContainer" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507223 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": container with ID starting with 
c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62 not found: ID does not exist" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507264 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} err="failed to get container status \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": rpc error: code = NotFound desc = could not find container \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": container with ID starting with c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507292 4722 scope.go:117] "RemoveContainer" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507633 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": container with ID starting with 01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9 not found: ID does not exist" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507665 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"} err="failed to get container status \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": rpc error: code = NotFound desc = could not find container \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": container with ID starting with 01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9 not found: ID does not 
exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507688 4722 scope.go:117] "RemoveContainer" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507964 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": container with ID starting with 6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4 not found: ID does not exist" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507998 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"} err="failed to get container status \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": rpc error: code = NotFound desc = could not find container \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": container with ID starting with 6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.508020 4722 scope.go:117] "RemoveContainer" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.521576 4722 scope.go:117] "RemoveContainer" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.533274 4722 scope.go:117] "RemoveContainer" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.548915 4722 scope.go:117] "RemoveContainer" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" Feb 26 20:00:04 crc 
kubenswrapper[4722]: E0226 20:00:04.549237 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": container with ID starting with a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e not found: ID does not exist" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549266 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"} err="failed to get container status \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": rpc error: code = NotFound desc = could not find container \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": container with ID starting with a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549288 4722 scope.go:117] "RemoveContainer" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.549508 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": container with ID starting with 8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333 not found: ID does not exist" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549530 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"} err="failed to get container status 
\"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": rpc error: code = NotFound desc = could not find container \"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": container with ID starting with 8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549545 4722 scope.go:117] "RemoveContainer" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.549754 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": container with ID starting with a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f not found: ID does not exist" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549781 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"} err="failed to get container status \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": rpc error: code = NotFound desc = could not find container \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": container with ID starting with a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f not found: ID does not exist" Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.156562 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" path="/var/lib/kubelet/pods/21b11897-db24-4d65-a438-d3695ccee5fc/volumes" Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.157476 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" 
path="/var/lib/kubelet/pods/2299b352-9475-4e85-9a5b-cb08aea743c2/volumes" Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.158009 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" path="/var/lib/kubelet/pods/4610ca54-dc80-47ad-b90f-61dffe47a076/volumes" Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.158560 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" path="/var/lib/kubelet/pods/94176c67-3742-4347-83c8-d467d4eb6be7/volumes" Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.159093 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" path="/var/lib/kubelet/pods/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e/volumes" Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.404203 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.406354 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407792 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407835 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f" exitCode=137 Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"} Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407894 4722 scope.go:117] "RemoveContainer" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e" Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.414838 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.416518 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.416577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"203ec754eb37a5c53c9f85223224fa4e767e237b50a44c91acd372ea49de7508"} Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.310961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.315852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.467291 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.470782 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.319697 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321368 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321473 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321563 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321702 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321781 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321848 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321905 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321964 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322088 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322165 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322262 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322431 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322517 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322600 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322687 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322773 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322847 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-utilities" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322929 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323006 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323200 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323295 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323377 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323460 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323540 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-content" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323903 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323986 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324063 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324159 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324243 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324342 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324430 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324938 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.326802 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.326894 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.327168 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.327294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.329035 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336549 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.341216 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.423536 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"] Feb 26 20:00:32 crc 
kubenswrapper[4722]: I0226 20:00:32.424250 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.436251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"] Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437298 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437856 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.446409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4kf5\" (UniqueName: \"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4kf5\" (UniqueName: 
\"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.579262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc 
kubenswrapper[4722]: I0226 20:00:32.579974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.584625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.587496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.611432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.616386 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.624077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4kf5\" (UniqueName: \"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.641874 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.647531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.760601 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.045820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.098195 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:00:33 crc kubenswrapper[4722]: W0226 20:00:33.102745 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7115d78f_2013_4549_ab88_5fde72d4267f.slice/crio-65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee WatchSource:0}: Error finding container 65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee: Status 404 returned error can't find the container with id 65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.174079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"] Feb 26 20:00:33 crc kubenswrapper[4722]: W0226 20:00:33.185450 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35655c90_2927_4858_a067_3e520498cd26.slice/crio-a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50 WatchSource:0}: Error finding container a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50: Status 404 returned error can't find the container with id a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50 Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.529860 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" 
event={"ID":"35655c90-2927-4858-a067-3e520498cd26","Type":"ContainerStarted","Data":"904a9c5a18d2204e6377ec0daac56c9546a57abf757084121c76198da754c8b4"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.529909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" event={"ID":"35655c90-2927-4858-a067-3e520498cd26","Type":"ContainerStarted","Data":"a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.530051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerStarted","Data":"1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531818 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n4nc7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531866 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" podUID="35655c90-2927-4858-a067-3e520498cd26" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533210 4722 generic.go:334] "Generic (PLEG): container finished" podID="7115d78f-2013-4549-ab88-5fde72d4267f" 
containerID="9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c" exitCode=0 Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerDied","Data":"9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerStarted","Data":"65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.550561 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" podStartSLOduration=1.550543638 podStartE2EDuration="1.550543638s" podCreationTimestamp="2026-02-26 20:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:00:33.548935004 +0000 UTC m=+376.085902928" watchObservedRunningTime="2026-02-26 20:00:33.550543638 +0000 UTC m=+376.087511562" Feb 26 20:00:34 crc kubenswrapper[4722]: I0226 20:00:34.541013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.348175 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.530783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531322 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531779 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.537764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n" (OuterVolumeSpecName: "kube-api-access-pf47n") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "kube-api-access-pf47n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.537770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerDied","Data":"65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee"} Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543887 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543952 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.547125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerStarted","Data":"f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0"} Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.568203 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535600-2lg25" podStartSLOduration=2.596409879 podStartE2EDuration="3.568178991s" podCreationTimestamp="2026-02-26 20:00:32 +0000 UTC" firstStartedPulling="2026-02-26 20:00:33.051456837 +0000 UTC m=+375.588424761" lastFinishedPulling="2026-02-26 20:00:34.023225949 +0000 UTC m=+376.560193873" observedRunningTime="2026-02-26 20:00:35.566334061 +0000 UTC m=+378.103301985" watchObservedRunningTime="2026-02-26 20:00:35.568178991 +0000 UTC m=+378.105146925" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.633495 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.633550 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:36 crc kubenswrapper[4722]: I0226 20:00:36.552276 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerID="f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0" exitCode=0 Feb 26 20:00:36 crc kubenswrapper[4722]: I0226 20:00:36.552312 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerDied","Data":"f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0"} Feb 26 20:00:37 crc kubenswrapper[4722]: I0226 20:00:37.864152 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.058951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.065372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq" (OuterVolumeSpecName: "kube-api-access-sj5tq") pod "6f39028f-65ac-4f51-a946-4cc88d7dc31b" (UID: "6f39028f-65ac-4f51-a946-4cc88d7dc31b"). InnerVolumeSpecName "kube-api-access-sj5tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.160118 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562591 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerDied","Data":"1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06"} Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562630 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562681 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:53 crc kubenswrapper[4722]: I0226 20:00:53.487836 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:00:53 crc kubenswrapper[4722]: I0226 20:00:53.488367 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.808660 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 
26 20:01:01 crc kubenswrapper[4722]: E0226 20:01:01.809436 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809451 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: E0226 20:01:01.809467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809567 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809587 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.810603 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.813284 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.822231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.010378 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.011542 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.014285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.023338 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.036662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 
26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.036800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.059564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " 
pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.190055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.237662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.239129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.259824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.328998 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.411898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.558366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: W0226 20:01:02.631794 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podededdfa7_a21a_4901_bb64_a8f9923a663a.slice/crio-8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04 WatchSource:0}: Error finding container 8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04: Status 404 returned error can't find the container with id 8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04 Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686608 4722 generic.go:334] "Generic (PLEG): container finished" podID="cf038f1a-6cde-4f79-b9c9-06ecb8807b1a" containerID="38ec66c64d5e8b834b98383e2503ed3f8d938f9ed14e7efb81474985e2dd77ea" exitCode=0 Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686710 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerDied","Data":"38ec66c64d5e8b834b98383e2503ed3f8d938f9ed14e7efb81474985e2dd77ea"} Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686761 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerStarted","Data":"671e7530f18dd043c9451d853b17f138c57fcfb12b0ac6d34db8a477374df674"} Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.690175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04"} Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.696505 4722 generic.go:334] "Generic (PLEG): container finished" podID="cf038f1a-6cde-4f79-b9c9-06ecb8807b1a" containerID="b82acb256a9da2e5f22b28c30c5e26254c0e927fee57355ecbc238c9977b3008" exitCode=0 Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.696596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerDied","Data":"b82acb256a9da2e5f22b28c30c5e26254c0e927fee57355ecbc238c9977b3008"} Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.698463 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f" exitCode=0 Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.698502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" 
event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f"} Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.411600 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mklbp"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.413246 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.414940 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.420896 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mklbp"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod 
\"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.608441 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"]
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.609685 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.611578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.621032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"]
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.664944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.686010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.707461 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb"}
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.710027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerStarted","Data":"fcc976f2108c4eb14ca04eab4b94437e35d4825d4dec983e0f1bd42e59682411"}
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.729257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.753687 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9dg5w" podStartSLOduration=2.378481024 podStartE2EDuration="3.753664895s" podCreationTimestamp="2026-02-26 20:01:01 +0000 UTC" firstStartedPulling="2026-02-26 20:01:02.68844012 +0000 UTC m=+405.225408044" lastFinishedPulling="2026-02-26 20:01:04.063623991 +0000 UTC m=+406.600591915" observedRunningTime="2026-02-26 20:01:04.748946835 +0000 UTC m=+407.285914769" watchObservedRunningTime="2026-02-26 20:01:04.753664895 +0000 UTC m=+407.290632829"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.866964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867480 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.868025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.889368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.923018 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.946520 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mklbp"]
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.161457 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"]
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.726881 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb" exitCode=0
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.727427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb"}
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732428 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f42259f-9c95-4fc1-af4a-711a171f8ea3" containerID="6a63e3ed0669331d88814c06c57ad501713e58fa550707cfa4db56b75998bf3f" exitCode=0
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerDied","Data":"6a63e3ed0669331d88814c06c57ad501713e58fa550707cfa4db56b75998bf3f"}
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"ee9a9e05a98e399fc75d1727030f5c54fee33a9b2b702f66985ba8a2f58bc69d"}
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739013 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerID="4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77" exitCode=0
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77"}
Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64"}
Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.745230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f"}
Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.746864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa"}
Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.748503 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41"}
Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.763731 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sj5r4" podStartSLOduration=3.323107206 podStartE2EDuration="5.763713101s" podCreationTimestamp="2026-02-26 20:01:01 +0000 UTC" firstStartedPulling="2026-02-26 20:01:03.699958302 +0000 UTC m=+406.236926226" lastFinishedPulling="2026-02-26 20:01:06.140564187 +0000 UTC m=+408.677532121" observedRunningTime="2026-02-26 20:01:06.760337588 +0000 UTC m=+409.297305512" watchObservedRunningTime="2026-02-26 20:01:06.763713101 +0000 UTC m=+409.300681025"
Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.756292 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f42259f-9c95-4fc1-af4a-711a171f8ea3" containerID="acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa" exitCode=0
Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.756398 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerDied","Data":"acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa"}
Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.758441 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerID="54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41" exitCode=0
Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.758503 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41"}
Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.764879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"2922e66887b37d8fde9239422acba3b62b23ab3d3bec7e392c696ae981049175"}
Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.766906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e"}
Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.779588 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mklbp" podStartSLOduration=2.340928266 podStartE2EDuration="4.779568897s" podCreationTimestamp="2026-02-26 20:01:04 +0000 UTC" firstStartedPulling="2026-02-26 20:01:05.735776837 +0000 UTC m=+408.272744761" lastFinishedPulling="2026-02-26 20:01:08.174417468 +0000 UTC m=+410.711385392" observedRunningTime="2026-02-26 20:01:08.778762404 +0000 UTC m=+411.315730348" watchObservedRunningTime="2026-02-26 20:01:08.779568897 +0000 UTC m=+411.316536831"
Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.794078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tbpk" podStartSLOduration=2.338856559 podStartE2EDuration="4.794060625s" podCreationTimestamp="2026-02-26 20:01:04 +0000 UTC" firstStartedPulling="2026-02-26 20:01:05.740294062 +0000 UTC m=+408.277261996" lastFinishedPulling="2026-02-26 20:01:08.195498128 +0000 UTC m=+410.732466062" observedRunningTime="2026-02-26 20:01:08.792677567 +0000 UTC m=+411.329645481" watchObservedRunningTime="2026-02-26 20:01:08.794060625 +0000 UTC m=+411.331028559"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.190748 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9dg5w"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.191104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9dg5w"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.227931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9dg5w"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.329970 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.330059 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.366879 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.822564 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9dg5w"
Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.824326 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.729958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.730302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.771233 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.831556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mklbp"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.923818 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.923868 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.961540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:15 crc kubenswrapper[4722]: I0226 20:01:15.837275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.797636 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"]
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.798811 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.820594 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"]
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960269 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.981171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061109 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061254 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062721 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.067741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.067915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.079269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.082618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.135256 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.489629 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.489960 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.541380 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"]
Feb 26 20:01:23 crc kubenswrapper[4722]: W0226 20:01:23.547781 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e56e5e8_e6bf_46c9_8087_d7e4af06e411.slice/crio-fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e WatchSource:0}: Error finding container fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e: Status 404 returned error can't find the container with id fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" event={"ID":"3e56e5e8-e6bf-46c9-8087-d7e4af06e411","Type":"ContainerStarted","Data":"9d951d20d9a86bac6d1219c404d4d01cc106244d96d6f22398d7ef6818cea418"}
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" event={"ID":"3e56e5e8-e6bf-46c9-8087-d7e4af06e411","Type":"ContainerStarted","Data":"fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e"}
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.873333 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" podStartSLOduration=1.873315938 podStartE2EDuration="1.873315938s" podCreationTimestamp="2026-02-26 20:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:01:23.871709644 +0000 UTC m=+426.408677628" watchObservedRunningTime="2026-02-26 20:01:23.873315938 +0000 UTC m=+426.410283862"
Feb 26 20:01:43 crc kubenswrapper[4722]: I0226 20:01:43.141780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:43 crc kubenswrapper[4722]: I0226 20:01:43.228343 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"]
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.487900 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488381 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488427 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488930 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03" gracePeriod=600
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.010603 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03" exitCode=0
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.010712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"}
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.011019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"}
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.011045 4722 scope.go:117] "RemoveContainer" containerID="e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.130978 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"]
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.132322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.134436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.134711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.136278 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.139670 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"]
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.143811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.244754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.262219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.457645 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.702383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"] Feb 26 20:02:01 crc kubenswrapper[4722]: I0226 20:02:01.060954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerStarted","Data":"8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c"} Feb 26 20:02:02 crc kubenswrapper[4722]: I0226 20:02:02.069279 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerID="5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702" exitCode=0 Feb 26 20:02:02 crc kubenswrapper[4722]: I0226 20:02:02.069341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerDied","Data":"5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702"} Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.313985 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.486408 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.492381 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54" (OuterVolumeSpecName: "kube-api-access-j2p54") pod "cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" (UID: "cb5fc7ac-5083-4a8e-b290-a47ecd62ca66"). InnerVolumeSpecName "kube-api-access-j2p54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.587867 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.081998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerDied","Data":"8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c"} Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.082070 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c" Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.082107 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.363346 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.371004 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 20:02:06 crc kubenswrapper[4722]: I0226 20:02:06.157250 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" path="/var/lib/kubelet/pods/7c96e488-8450-4dff-ac4c-5ac9e210a9a6/volumes" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.265458 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry" containerID="cri-o://dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" gracePeriod=30 Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.577397 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.758904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.758983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759125 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759196 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759970 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.760630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.763948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.764808 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj" (OuterVolumeSpecName: "kube-api-access-njqxj") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "kube-api-access-njqxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.765229 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.765390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.771122 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.776194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860798 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860844 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860858 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860871 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860888 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860901 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860912 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") on node \"crc\" DevicePath \"\"" Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113740 4722 generic.go:334] "Generic (PLEG): container finished" podID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" exitCode=0 Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerDied","Data":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"} Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113863 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113892 4722 scope.go:117] "RemoveContainer" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113867 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerDied","Data":"5b613cb39b5bcd5c7a499190105759fdfd8d946463c6f500054844f082aa192b"} Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.140428 4722 scope.go:117] "RemoveContainer" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" Feb 26 20:02:09 crc kubenswrapper[4722]: E0226 20:02:09.140922 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": container with ID starting with dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3 not found: ID does not exist" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.140962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"} err="failed to get container status \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": rpc error: code = NotFound desc = could not find container \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": container with ID starting with dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3 not found: ID does not exist" Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.147607 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.151805 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 20:02:10 crc kubenswrapper[4722]: I0226 20:02:10.156614 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" path="/var/lib/kubelet/pods/38bc8665-24b9-47b9-b7d2-0e45f55a0112/volumes" Feb 26 20:03:53 crc kubenswrapper[4722]: I0226 20:03:53.487892 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:03:53 crc kubenswrapper[4722]: I0226 20:03:53.488460 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.138906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:04:00 crc kubenswrapper[4722]: E0226 20:04:00.139796 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry" Feb 26 20:04:00 crc kubenswrapper[4722]: E0226 20:04:00.139830 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc" Feb 26 
20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139836 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139951 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139966 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.140412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142679 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142731 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142706 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.157242 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.299474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.400386 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.425507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.468441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.876200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:04:00 crc kubenswrapper[4722]: W0226 20:04:00.888242 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a0b333_4923_4483_b110_ea7109c80c67.slice/crio-6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14 WatchSource:0}: Error finding container 6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14: Status 404 returned error can't find the container with id 6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14 Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.890924 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:04:01 crc kubenswrapper[4722]: I0226 20:04:01.755423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" 
event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerStarted","Data":"6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14"} Feb 26 20:04:02 crc kubenswrapper[4722]: I0226 20:04:02.765625 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1a0b333-4923-4483-b110-ea7109c80c67" containerID="45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974" exitCode=0 Feb 26 20:04:02 crc kubenswrapper[4722]: I0226 20:04:02.765674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerDied","Data":"45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974"} Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.025013 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.145846 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"c1a0b333-4923-4483-b110-ea7109c80c67\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.152664 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr" (OuterVolumeSpecName: "kube-api-access-lrxbr") pod "c1a0b333-4923-4483-b110-ea7109c80c67" (UID: "c1a0b333-4923-4483-b110-ea7109c80c67"). InnerVolumeSpecName "kube-api-access-lrxbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.247422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") on node \"crc\" DevicePath \"\"" Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerDied","Data":"6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14"} Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790386 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14" Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790442 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" Feb 26 20:04:05 crc kubenswrapper[4722]: I0226 20:04:05.094800 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 20:04:05 crc kubenswrapper[4722]: I0226 20:04:05.100666 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 20:04:06 crc kubenswrapper[4722]: I0226 20:04:06.158950 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" path="/var/lib/kubelet/pods/452039e5-ebab-456a-8ca8-045fa1b1c90a/volumes" Feb 26 20:04:18 crc kubenswrapper[4722]: I0226 20:04:18.453511 4722 scope.go:117] "RemoveContainer" containerID="fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349" Feb 26 20:04:23 crc kubenswrapper[4722]: I0226 20:04:23.487109 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:04:23 crc kubenswrapper[4722]: I0226 20:04:23.487469 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.487847 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488371 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488414 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488886 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488938 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" gracePeriod=600
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.087585 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" exitCode=0
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.087677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"}
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.088115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"}
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.088152 4722 scope.go:117] "RemoveContainer" containerID="82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.905731 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"]
Feb 26 20:05:08 crc kubenswrapper[4722]: E0226 20:05:08.906640 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.906657 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.906783 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.907781 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.911051 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.917066 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"]
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.079952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.080035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.080104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.180801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.204009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.230661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.430194 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"]
Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179454 4722 generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="3c477bc44da2cb4aa81eb48a867b7365edc0ae3beed67470d432053057585289" exitCode=0
Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"3c477bc44da2cb4aa81eb48a867b7365edc0ae3beed67470d432053057585289"}
Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179531 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerStarted","Data":"7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6"}
Feb 26 20:05:12 crc kubenswrapper[4722]: I0226 20:05:12.189840 4722 generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="20be98c7eb837e6ae7f92c358ebe6d9f5b88fb8804d88f30f8699ce27e9ceac3" exitCode=0
Feb 26 20:05:12 crc kubenswrapper[4722]: I0226 20:05:12.189944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"20be98c7eb837e6ae7f92c358ebe6d9f5b88fb8804d88f30f8699ce27e9ceac3"}
Feb 26 20:05:13 crc kubenswrapper[4722]: I0226 20:05:13.195728 4722 generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="167aa5927473cd1c53e38f8cae652cee644a1c8f8c5dd7799febe145348216c1" exitCode=0
Feb 26 20:05:13 crc kubenswrapper[4722]: I0226 20:05:13.195766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"167aa5927473cd1c53e38f8cae652cee644a1c8f8c5dd7799febe145348216c1"}
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.401479 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") "
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") "
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") "
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.563727 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle" (OuterVolumeSpecName: "bundle") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.567250 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn" (OuterVolumeSpecName: "kube-api-access-fjghn") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "kube-api-access-fjghn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.578217 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util" (OuterVolumeSpecName: "util") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662694 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") on node \"crc\" DevicePath \"\""
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662730 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") on node \"crc\" DevicePath \"\""
Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662741 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6"}
Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208653 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6"
Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208652 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:18 crc kubenswrapper[4722]: I0226 20:05:18.497958 4722 scope.go:117] "RemoveContainer" containerID="038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.143202 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"]
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.143950 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller" containerID="cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144345 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd" containerID="cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144369 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb" containerID="cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144440 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb" containerID="cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144508 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144516 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging" containerID="cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144511 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node" containerID="cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.242496 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" containerID="cri-o://e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" gracePeriod=30
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.551872 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.557403 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-acl-logging/0.log"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.557789 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-controller/0.log"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.558098 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634206 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lc7x7"]
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634484 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634498 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634506 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634522 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634529 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634537 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kubecfg-setup"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634545 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kubecfg-setup"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634554 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634561 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634572 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634579 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634587 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634597 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634607 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634626 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634633 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="util"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634650 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="util"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634662 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="pull"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634669 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="pull"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634679 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634705 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634714 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634721 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634738 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634844 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634858 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634868 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634877 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634886 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634897 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634906 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634920 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634929 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634938 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634947 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634956 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging"
Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.635068 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.635078 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.635198 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.637160 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638321 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638445 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7"
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739386 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739470 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739496 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") "
Feb 26 20:05:20 crc kubenswrapper[4722]: I0226
20:05:20.739587 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739696 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739803 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: 
"110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") 
pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741867 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742234 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742321 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742408 4722 reconciler_common.go:293] "Volume detached 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743250 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket" (OuterVolumeSpecName: "log-socket") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash" (OuterVolumeSpecName: "host-slash") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log" (OuterVolumeSpecName: "node-log") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743359 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743384 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.745988 
4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp" (OuterVolumeSpecName: "kube-api-access-vdlkp") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "kube-api-access-vdlkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.746124 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.756041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844153 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844172 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844183 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844193 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844202 4722 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844211 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844220 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844228 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844236 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844245 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844265 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844277 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844286 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844295 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844303 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844312 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844320 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844361 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.845065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.845097 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.848309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.859709 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.952786 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.237978 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.240284 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-acl-logging/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.240788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-controller/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241158 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241181 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241190 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" exitCode=0 Feb 26 20:05:21 crc 
kubenswrapper[4722]: I0226 20:05:21.241214 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241221 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241244 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241255 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" exitCode=143 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241262 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" exitCode=143 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241365 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241377 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241542 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241553 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241559 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241564 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241569 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241574 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241578 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241583 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241588 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241602 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241609 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241614 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241619 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241624 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241629 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241634 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241638 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241643 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241647 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241662 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241667 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241673 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241678 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241683 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 
20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241689 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241696 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241701 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241706 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241712 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241727 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241734 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241740 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241745 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241750 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241757 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241763 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241768 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241773 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241778 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242806 4722 generic.go:334] "Generic (PLEG): container finished" podID="5988c6cd-df65-4e25-a262-45335d20144e" containerID="89692049c18aaf9586b19ca74ab183f8c1fb87ffe582d8d8e227f7360f3ff8f6" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerDied","Data":"89692049c18aaf9586b19ca74ab183f8c1fb87ffe582d8d8e227f7360f3ff8f6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"0541eb30896e74d229c27439ec12c4d9ce54327ade4e94e14154539f53dc6609"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244585 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244974 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245000 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb99326-dd22-4186-84da-ba208f104cd6" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" exitCode=2 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerDied","Data":"9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245030 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245285 4722 scope.go:117] "RemoveContainer" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.245423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cfwh9_openshift-multus(2bb99326-dd22-4186-84da-ba208f104cd6)\"" pod="openshift-multus/multus-cfwh9" podUID="2bb99326-dd22-4186-84da-ba208f104cd6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.261302 4722 scope.go:117] "RemoveContainer" 
containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.306885 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.323259 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.345180 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.351377 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.370402 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.386907 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.397520 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.418820 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.440831 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.488089 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.537301 4722 scope.go:117] "RemoveContainer" 
containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.541262 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541312 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541341 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.541691 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541725 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541748 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.542055 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542073 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542086 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.542394 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542415 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542822 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543079 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543097 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container 
\"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543109 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543299 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543316 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543328 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543524 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" 
containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543546 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543578 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543791 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543860 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543922 4722 scope.go:117] 
"RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.544128 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544378 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544437 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.544668 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544742 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544802 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545328 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545410 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545616 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not 
exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545682 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545890 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545955 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546184 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546250 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546519 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status 
\"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546589 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546780 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546888 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547089 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547185 4722 scope.go:117] "RemoveContainer" 
containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547896 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547933 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.548711 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.548740 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.552572 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could 
not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.552629 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.555127 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.555173 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.559454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.559547 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 
20:05:21.563096 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.563157 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.567131 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.567178 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.568218 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 
4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.568238 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571212 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571253 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571484 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571502 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571768 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571785 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.572074 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.572090 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573009 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not 
exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573037 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573553 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573569 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.574026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.574042 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582526 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status 
\"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582583 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582993 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583008 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583247 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583261 4722 scope.go:117] "RemoveContainer" 
containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583516 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583578 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583855 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583869 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584072 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could 
not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584084 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584472 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584491 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.585185 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.585200 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 
20:05:21.585397 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.153859 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" path="/var/lib/kubelet/pods/110fea1c-1463-40d7-bb4b-1825d5b706f0/volumes" Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.252937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"a08bd02f6a29b1061f8f8bbbb29b700d5287b70a47cae02b599b875d35c141dc"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.253994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"eb104e5669595b8095c013dd27cda30813561a94fda0aff3a1574350d0fd1990"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"74588e99ba305f11d05fd5bdae58b2e82892e37a23a28b8eb2a2b81513b306bc"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" 
event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"512c4ca1e36f8e1db30bc7b3eff0cd7edaa3f19e067a1e0131f14e0e24feb6be"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"366ccdcc21e573f007415c7297b8582e325b2017efee3f69f6907d0d7ba3e4c5"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254269 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"606d03aeb4229569c4c9d5112a36dbf1fd28e7a07fb020dc904d11eb532470ef"} Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.868069 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.868845 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872605 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-dcch2" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872910 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.880872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.973489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.985904 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.986564 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.988445 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.988626 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9kqrw" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:23.993502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.021834 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.022634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.121343 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.122201 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.123943 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.124078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-c72h5" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176452 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176466 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.185804 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214003 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214067 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214092 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214187 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podUID="edddb923-4396-43c9-880a-ed3ac0215808" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.268077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"9e085c73b6c4a092ad217bdb91f741264fbe0c2845dd96e38296ed12b7cb4ae7"} Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.281882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.283708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.295908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.296619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.298804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.299572 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.314932 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.315551 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: W0226 20:05:24.316872 4722 reflector.go:561] object-"openshift-operators"/"perses-operator-dockercfg-rmk7l": failed to list *v1.Secret: secrets "perses-operator-dockercfg-rmk7l" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.316915 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"perses-operator-dockercfg-rmk7l\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"perses-operator-dockercfg-rmk7l\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.347267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.355467 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.384853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.384943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386696 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386761 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386790 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386843 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podUID="dde658b6-956e-4b8c-86b6-e707bfcc0dbf" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418285 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418366 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418385 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418425 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podUID="419eee0b-c988-42e3-af4f-cef110425bb3" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.482426 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.486579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.486644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.487912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504323 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504376 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504397 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504434 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.523525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: I0226 20:05:25.262240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-rmk7l" Feb 26 20:05:25 crc kubenswrapper[4722]: I0226 20:05:25.269866 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293181 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293256 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293282 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293338 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podUID="c5420a13-8c3b-45fa-9c99-a796202b11d9" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"19a8f7d41e2f601763c4de5f2750cc16b933e45d2d91a84cf5e8b92dc8636716"} Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288765 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288799 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.316532 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.320456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.324358 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" podStartSLOduration=7.324342508 podStartE2EDuration="7.324342508s" podCreationTimestamp="2026-02-26 20:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:05:27.321675466 +0000 UTC m=+669.858643410" watchObservedRunningTime="2026-02-26 20:05:27.324342508 +0000 UTC m=+669.861310432" Feb 26 20:05:27 crc kubenswrapper[4722]: 
I0226 20:05:27.514713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.514861 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.515293 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520035 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520199 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520887 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.527644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.527752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.528219 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.538682 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.538999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.539486 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544315 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557234 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557309 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557336 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557381 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599183 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599256 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599276 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599334 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podUID="419eee0b-c988-42e3-af4f-cef110425bb3" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612453 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612559 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612592 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612663 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podUID="c5420a13-8c3b-45fa-9c99-a796202b11d9" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.621726 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622077 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622126 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622207 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podUID="edddb923-4396-43c9-880a-ed3ac0215808" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629270 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629345 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629379 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podUID="dde658b6-956e-4b8c-86b6-e707bfcc0dbf" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.146459 4722 scope.go:117] "RemoveContainer" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.332269 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.333013 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.333065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"4b6c63e92329c42ebf109326f8fdc39523b16d19e021d88f7ce4705b8bb0c92c"} Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 
20:05:39.145571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 20:05:39.146409 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 20:05:39.577643 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:39 crc kubenswrapper[4722]: W0226 20:05:39.589359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedddb923_4396_43c9_880a_ed3ac0215808.slice/crio-8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe WatchSource:0}: Error finding container 8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe: Status 404 returned error can't find the container with id 8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.145337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.145933 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.359404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" event={"ID":"edddb923-4396-43c9-880a-ed3ac0215808","Type":"ContainerStarted","Data":"8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe"} Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.448639 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:40 crc kubenswrapper[4722]: W0226 20:05:40.461659 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419eee0b_c988_42e3_af4f_cef110425bb3.slice/crio-3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed WatchSource:0}: Error finding container 3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed: Status 404 returned error can't find the container with id 3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed Feb 26 20:05:41 crc kubenswrapper[4722]: I0226 20:05:41.373787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" event={"ID":"419eee0b-c988-42e3-af4f-cef110425bb3","Type":"ContainerStarted","Data":"3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed"} Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145651 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.028781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.175754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:45 crc kubenswrapper[4722]: W0226 20:05:45.179350 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde658b6_956e_4b8c_86b6_e707bfcc0dbf.slice/crio-3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c WatchSource:0}: Error finding container 3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c: Status 404 returned error can't find the container with id 3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.270157 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:45 crc kubenswrapper[4722]: W0226 20:05:45.271110 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5420a13_8c3b_45fa_9c99_a796202b11d9.slice/crio-2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d WatchSource:0}: Error finding container 2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d: Status 404 returned error can't find the container with id 2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.396522 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" event={"ID":"dde658b6-956e-4b8c-86b6-e707bfcc0dbf","Type":"ContainerStarted","Data":"83a8c643fc5532362ae1862b174622955bd9b9b00f181a53f79214c0eec2c06c"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.396590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" event={"ID":"dde658b6-956e-4b8c-86b6-e707bfcc0dbf","Type":"ContainerStarted","Data":"3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.398006 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" event={"ID":"edddb923-4396-43c9-880a-ed3ac0215808","Type":"ContainerStarted","Data":"01d50fa762957456a8971e25001b30ded8e545cca90f724b336a60f0709134e6"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.399345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" event={"ID":"c5420a13-8c3b-45fa-9c99-a796202b11d9","Type":"ContainerStarted","Data":"2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d"} Feb 26 20:05:45 crc 
kubenswrapper[4722]: I0226 20:05:45.400724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" event={"ID":"b61de85a-5167-4af3-b14b-993cb20559fa","Type":"ContainerStarted","Data":"243f35ca5f6ceb3e09d0d56fe77efe616fc1a9e4aacaee3e00654b7a6b0ba56d"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.402038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" event={"ID":"419eee0b-c988-42e3-af4f-cef110425bb3","Type":"ContainerStarted","Data":"e5f1834a6854388de4a66fff1ddc74da096550af2eaa1be87c6af750f7cee1c3"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.419102 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podStartSLOduration=22.419082406 podStartE2EDuration="22.419082406s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:05:45.414505942 +0000 UTC m=+687.951473886" watchObservedRunningTime="2026-02-26 20:05:45.419082406 +0000 UTC m=+687.956050350" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.472808 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podStartSLOduration=17.2494652 podStartE2EDuration="22.47278761s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="2026-02-26 20:05:39.590939346 +0000 UTC m=+682.127907270" lastFinishedPulling="2026-02-26 20:05:44.814261756 +0000 UTC m=+687.351229680" observedRunningTime="2026-02-26 20:05:45.461537056 +0000 UTC m=+687.998504990" watchObservedRunningTime="2026-02-26 20:05:45.47278761 +0000 UTC m=+688.009755534" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.491531 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podStartSLOduration=18.116260115 podStartE2EDuration="22.491511637s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="2026-02-26 20:05:40.464570026 +0000 UTC m=+683.001537950" lastFinishedPulling="2026-02-26 20:05:44.839821548 +0000 UTC m=+687.376789472" observedRunningTime="2026-02-26 20:05:45.490812228 +0000 UTC m=+688.027780172" watchObservedRunningTime="2026-02-26 20:05:45.491511637 +0000 UTC m=+688.028479561" Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.416127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" event={"ID":"c5420a13-8c3b-45fa-9c99-a796202b11d9","Type":"ContainerStarted","Data":"11093c9753f6516344cef6d1d069b98a14e4756687ccb34e733de882211adff0"} Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.416679 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.435359 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podStartSLOduration=22.068632756 podStartE2EDuration="24.435344502s" podCreationTimestamp="2026-02-26 20:05:24 +0000 UTC" firstStartedPulling="2026-02-26 20:05:45.274056278 +0000 UTC m=+687.811024202" lastFinishedPulling="2026-02-26 20:05:47.640768024 +0000 UTC m=+690.177735948" observedRunningTime="2026-02-26 20:05:48.432200257 +0000 UTC m=+690.969168201" watchObservedRunningTime="2026-02-26 20:05:48.435344502 +0000 UTC m=+690.972312426" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.428903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" 
event={"ID":"b61de85a-5167-4af3-b14b-993cb20559fa","Type":"ContainerStarted","Data":"2d7b6eafb392012cdf85c154c0bf12a1a011501d0389ee08fff1a2176736fe5b"} Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.429348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.433120 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-bmtvj container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.19:8081/healthz\": dial tcp 10.217.0.19:8081: connect: connection refused" start-of-body= Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.433202 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": dial tcp 10.217.0.19:8081: connect: connection refused" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.451864 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podStartSLOduration=21.280587134 podStartE2EDuration="26.451843974s" podCreationTimestamp="2026-02-26 20:05:24 +0000 UTC" firstStartedPulling="2026-02-26 20:05:45.112237525 +0000 UTC m=+687.649205449" lastFinishedPulling="2026-02-26 20:05:50.283494365 +0000 UTC m=+692.820462289" observedRunningTime="2026-02-26 20:05:50.446445258 +0000 UTC m=+692.983413202" watchObservedRunningTime="2026-02-26 20:05:50.451843974 +0000 UTC m=+692.988811898" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.973834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:51 crc kubenswrapper[4722]: I0226 20:05:51.434619 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:55 crc kubenswrapper[4722]: I0226 20:05:55.274957 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.923692 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.924370 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928403 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2fcf9" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928434 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928472 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.940026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.945431 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.946209 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.949006 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8768h" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.949953 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.950799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.954129 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-fdbr6" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.973087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.980621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " 
pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnhh\" (UniqueName: \"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnhh\" (UniqueName: \"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.175395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnhh\" (UniqueName: 
\"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.186838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.200998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.240623 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.259994 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.265513 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.523093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.626309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:57 crc kubenswrapper[4722]: W0226 20:05:57.632532 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc966e2d5_2260_4d2f_ab59_4658284e872d.slice/crio-9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee WatchSource:0}: Error finding container 9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee: Status 404 returned error can't find the container with id 9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.831340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:57 crc kubenswrapper[4722]: W0226 20:05:57.835615 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b627d55_dcd7_42c6_948f_a50f17bc7688.slice/crio-31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f WatchSource:0}: Error finding container 31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f: Status 404 returned error can't find the container with id 31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.492909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" 
event={"ID":"4b627d55-dcd7-42c6-948f-a50f17bc7688","Type":"ContainerStarted","Data":"31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f"} Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.494013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" event={"ID":"c966e2d5-2260-4d2f-ab59-4658284e872d","Type":"ContainerStarted","Data":"9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee"} Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.495121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9d76n" event={"ID":"d66ba312-de97-438e-a172-5bcd2b6ef4db","Type":"ContainerStarted","Data":"7802d043e7fde35eac44da85b9c976969cfc402b9c36498228a7ec728d37fef5"} Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.128893 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.129556 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.130994 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.131324 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.131718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.136754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.197712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.298422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.320175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " 
pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.446938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.349231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.526161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerStarted","Data":"44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.528543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9d76n" event={"ID":"d66ba312-de97-438e-a172-5bcd2b6ef4db","Type":"ContainerStarted","Data":"917e8016b9a12507479a94960535dcbaeb617f8ff9962d807ae8a5a748009a8d"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.530380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" event={"ID":"4b627d55-dcd7-42c6-948f-a50f17bc7688","Type":"ContainerStarted","Data":"b24a809778990e974d6ad271b44c5affdee719057f58d432a55da1bb5d6eec9e"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.530498 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.532949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" event={"ID":"c966e2d5-2260-4d2f-ab59-4658284e872d","Type":"ContainerStarted","Data":"9fab6402798922e9f2c091c20801b632303342af16271c9f773968b5dada9eff"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.549365 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-9d76n" podStartSLOduration=1.919343161 podStartE2EDuration="6.549346312s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.538177088 +0000 UTC m=+700.075145002" lastFinishedPulling="2026-02-26 20:06:02.168180229 +0000 UTC m=+704.705148153" observedRunningTime="2026-02-26 20:06:02.546625089 +0000 UTC m=+705.083593013" watchObservedRunningTime="2026-02-26 20:06:02.549346312 +0000 UTC m=+705.086314236" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.572259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" podStartSLOduration=1.985530694 podStartE2EDuration="6.572228892s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.63609905 +0000 UTC m=+700.173066974" lastFinishedPulling="2026-02-26 20:06:02.222797248 +0000 UTC m=+704.759765172" observedRunningTime="2026-02-26 20:06:02.568448289 +0000 UTC m=+705.105416213" watchObservedRunningTime="2026-02-26 20:06:02.572228892 +0000 UTC m=+705.109196846" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.590246 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" podStartSLOduration=2.260198683 podStartE2EDuration="6.590226249s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.837732611 +0000 UTC m=+700.374700535" lastFinishedPulling="2026-02-26 20:06:02.167760167 +0000 UTC m=+704.704728101" observedRunningTime="2026-02-26 20:06:02.589476959 +0000 UTC m=+705.126444883" watchObservedRunningTime="2026-02-26 20:06:02.590226249 +0000 UTC m=+705.127194173" Feb 26 20:06:04 crc kubenswrapper[4722]: I0226 20:06:04.545538 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" 
containerID="1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63" exitCode=0 Feb 26 20:06:04 crc kubenswrapper[4722]: I0226 20:06:04.545638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerDied","Data":"1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63"} Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.803366 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.873985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.879983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr" (OuterVolumeSpecName: "kube-api-access-kt2qr") pod "e3133c2f-ea60-41e1-bf7e-443c44a47c41" (UID: "e3133c2f-ea60-41e1-bf7e-443c44a47c41"). InnerVolumeSpecName "kube-api-access-kt2qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.975410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.559801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerDied","Data":"44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59"} Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.560121 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.559848 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.875340 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.881562 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:06:07 crc kubenswrapper[4722]: I0226 20:06:07.268470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:06:08 crc kubenswrapper[4722]: I0226 20:06:08.154210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" path="/var/lib/kubelet/pods/6f39028f-65ac-4f51-a946-4cc88d7dc31b/volumes" Feb 26 20:06:18 crc kubenswrapper[4722]: I0226 20:06:18.582253 4722 scope.go:117] "RemoveContainer" 
containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" Feb 26 20:06:19 crc kubenswrapper[4722]: I0226 20:06:19.633879 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.874666 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:30 crc kubenswrapper[4722]: E0226 20:06:30.875320 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.875332 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.875435 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.876129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.879183 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.888168 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.993658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.994292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.994425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: 
I0226 20:06:31.095215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.096253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.115344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.200423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.382461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709587 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="c7c604649b77bae3a8df464296684b5e1511bd96c42a0fbfe1eedae08037c686" exitCode=0 Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"c7c604649b77bae3a8df464296684b5e1511bd96c42a0fbfe1eedae08037c686"} Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerStarted","Data":"f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903"} Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.472531 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.473223 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-tjl49" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.482393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.623503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.623589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc 
kubenswrapper[4722]: I0226 20:06:32.725268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.725308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.727887 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.727924 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3c101122d83be68c25b357750bfc70e16d81943ec71123fb549bfa77291905ce/globalmount\"" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.744249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.748478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.788584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.979032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.733887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"19df822a-3fc6-4a7a-a62e-2bf21c7b1739","Type":"ContainerStarted","Data":"d62b8d6cee033adb0479d877680e483a868699345fa6a52975005796f1767ce9"} Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.736683 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="efd21a6a8c8e5f3d369ec3d1ff7ad8317a88ac5bc0cc337749ceb922b3f29b32" exitCode=0 Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.736718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"efd21a6a8c8e5f3d369ec3d1ff7ad8317a88ac5bc0cc337749ceb922b3f29b32"} Feb 26 20:06:34 crc kubenswrapper[4722]: I0226 20:06:34.745037 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="ff6ba6dc24fa5036fd2d5de6b0bdf8d3dd3182a36fac79a25b018ac1353fd33c" exitCode=0 Feb 26 20:06:34 crc kubenswrapper[4722]: I0226 20:06:34.745182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" 
event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"ff6ba6dc24fa5036fd2d5de6b0bdf8d3dd3182a36fac79a25b018ac1353fd33c"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.098729 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268890 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.269968 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle" (OuterVolumeSpecName: "bundle") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.274834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm" (OuterVolumeSpecName: "kube-api-access-rpknm") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "kube-api-access-rpknm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.286665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util" (OuterVolumeSpecName: "util") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370730 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370772 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370788 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.756847 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.756831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.757266 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.757758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"19df822a-3fc6-4a7a-a62e-2bf21c7b1739","Type":"ContainerStarted","Data":"7eb6129873860dfe01e5b5f99d58be634f6ac84209e7ac6b844f0dad545ad17f"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.774560 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.581754076 podStartE2EDuration="6.774539167s" podCreationTimestamp="2026-02-26 20:06:30 +0000 UTC" firstStartedPulling="2026-02-26 20:06:32.986888757 +0000 UTC m=+735.523856681" lastFinishedPulling="2026-02-26 20:06:36.179673848 +0000 UTC m=+738.716641772" observedRunningTime="2026-02-26 20:06:36.76987302 +0000 UTC m=+739.306840954" watchObservedRunningTime="2026-02-26 20:06:36.774539167 +0000 UTC m=+739.311507091" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.280795 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281295 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="pull" Feb 26 20:06:42 crc 
kubenswrapper[4722]: I0226 20:06:42.281306 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="pull" Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281319 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="util" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281325 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="util" Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281338 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281440 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.282034 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285012 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285472 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285886 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.286503 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-t4p9s" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.304039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: 
I0226 20:06:42.445642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: 
\"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.548321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553895 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553895 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.564826 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.597844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:43 crc kubenswrapper[4722]: I0226 20:06:43.101028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:43 crc kubenswrapper[4722]: I0226 20:06:43.796442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"86ec2010ff9ffee303c5b8cdcd734934e12d16b78f975233361cba2b27b77183"} Feb 26 20:06:53 crc kubenswrapper[4722]: I0226 20:06:53.486924 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:06:53 crc kubenswrapper[4722]: I0226 20:06:53.487582 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:07:08 crc kubenswrapper[4722]: I0226 20:07:08.944647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"fd80d951f4b214ba285f65186bbbeb0264b79d481d126b16545031fcb045c862"} Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.978145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"f6710b3b6ddd7a1ce13130c24e5ec946ddd45f4e9e571310dacd22eb7b13ee9e"} Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.980242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.983253 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:07:14 crc kubenswrapper[4722]: I0226 20:07:14.005576 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" podStartSLOduration=1.5505903970000001 podStartE2EDuration="32.005516388s" podCreationTimestamp="2026-02-26 20:06:42 +0000 UTC" firstStartedPulling="2026-02-26 20:06:43.110418053 +0000 UTC m=+745.647385977" lastFinishedPulling="2026-02-26 20:07:13.565344044 +0000 UTC m=+776.102311968" observedRunningTime="2026-02-26 20:07:14.000356448 +0000 UTC m=+776.537324382" watchObservedRunningTime="2026-02-26 20:07:14.005516388 +0000 UTC m=+776.542484312" Feb 26 20:07:18 crc kubenswrapper[4722]: I0226 20:07:18.641614 4722 scope.go:117] "RemoveContainer" containerID="f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0" Feb 26 20:07:23 crc kubenswrapper[4722]: I0226 20:07:23.486960 4722 patch_prober.go:28] interesting 
pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:07:23 crc kubenswrapper[4722]: I0226 20:07:23.487308 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.135304 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.136968 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.139661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.140911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.325643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 
crc kubenswrapper[4722]: I0226 20:07:37.325727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.325814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.428105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.446405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.457430 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.631284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107544 4722 generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="e4435b8edb263b54ee2a9cce3ec7e17733433e2e18f9cdb40db5d278b25c3562" exitCode=0 Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"e4435b8edb263b54ee2a9cce3ec7e17733433e2e18f9cdb40db5d278b25c3562"} Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerStarted","Data":"db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d"} Feb 26 20:07:40 crc kubenswrapper[4722]: I0226 20:07:40.122066 4722 generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="1e0bda2100592b2b977b94b70ed4d6bca6a65bc1d20d0e684df9b6670503538c" exitCode=0 Feb 26 20:07:40 crc kubenswrapper[4722]: I0226 20:07:40.122107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"1e0bda2100592b2b977b94b70ed4d6bca6a65bc1d20d0e684df9b6670503538c"} Feb 26 20:07:41 crc kubenswrapper[4722]: I0226 20:07:41.130351 4722 
generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="f322977a2cbc4b3045471fff6b03ba8000ca3490d94b013cb2f1c9b31316dddb" exitCode=0 Feb 26 20:07:41 crc kubenswrapper[4722]: I0226 20:07:41.130647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"f322977a2cbc4b3045471fff6b03ba8000ca3490d94b013cb2f1c9b31316dddb"} Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.404520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.489027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle" (OuterVolumeSpecName: "bundle") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.493089 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9" (OuterVolumeSpecName: "kube-api-access-sb4c9") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "kube-api-access-sb4c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.501901 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util" (OuterVolumeSpecName: "util") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589299 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589329 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589338 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679164 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679427 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="util" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="util" Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679455 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="pull" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679462 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="pull" Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679475 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 
20:07:42.679482 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679611 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.680517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.699063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791815 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893375 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.894033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.931741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.997637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d"} Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145506 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d" Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145568 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"
Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.432213 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"]
Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.151794 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" exitCode=0
Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.152473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2"}
Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.152523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerStarted","Data":"38e167c2e244696bea734e2faa94be954c00560998499fd0aff68b2debca0404"}
Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.310234 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.994556 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"]
Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.995964 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"
Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.998786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.999313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rs9hh"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.000042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.013500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"]
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.135869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w6n\" (UniqueName: \"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.165393 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" exitCode=0
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.165463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e"}
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.236511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57w6n\" (UniqueName: \"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.264120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57w6n\" (UniqueName: \"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.309839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"
Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.643606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"]
Feb 26 20:07:46 crc kubenswrapper[4722]: W0226 20:07:46.651844 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ddbbd5_3eef_4fc9_ab2b_20e2572538cb.slice/crio-40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136 WatchSource:0}: Error finding container 40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136: Status 404 returned error can't find the container with id 40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136
Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.171341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" event={"ID":"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb","Type":"ContainerStarted","Data":"40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136"}
Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.173199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerStarted","Data":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"}
Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.197482 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g2lbh" podStartSLOduration=2.775192238 podStartE2EDuration="5.197463514s" podCreationTimestamp="2026-02-26 20:07:42 +0000 UTC" firstStartedPulling="2026-02-26 20:07:44.153339639 +0000 UTC m=+806.690307563" lastFinishedPulling="2026-02-26 20:07:46.575610915 +0000 UTC m=+809.112578839" observedRunningTime="2026-02-26 20:07:47.192006246 +0000 UTC m=+809.728974190" watchObservedRunningTime="2026-02-26 20:07:47.197463514 +0000 UTC m=+809.734431438"
Feb 26 20:07:50 crc kubenswrapper[4722]: I0226 20:07:50.192200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" event={"ID":"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb","Type":"ContainerStarted","Data":"38e51ef069b238075529603650313672cdb23452026741c0cc9d76a1b81d4f24"}
Feb 26 20:07:50 crc kubenswrapper[4722]: I0226 20:07:50.213414 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" podStartSLOduration=2.4243588369999998 podStartE2EDuration="5.213369527s" podCreationTimestamp="2026-02-26 20:07:45 +0000 UTC" firstStartedPulling="2026-02-26 20:07:46.661086126 +0000 UTC m=+809.198054050" lastFinishedPulling="2026-02-26 20:07:49.450096816 +0000 UTC m=+811.987064740" observedRunningTime="2026-02-26 20:07:50.212953076 +0000 UTC m=+812.749921000" watchObservedRunningTime="2026-02-26 20:07:50.213369527 +0000 UTC m=+812.750337461"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.120847 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.122331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.124046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-t6vsv"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.128813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.129789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.133228 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.135285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.148708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.169403 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-m7dz9"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.170244 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.213438 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: \"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.260868 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.261586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.263267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m5m7j"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.263650 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.264803 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.308712 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: \"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.332853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: \"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416214 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.422934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.435904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.440470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.440774 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.449617 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.462433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cf97f8476-44v57"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.463098 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.484007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf97f8476-44v57"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.488374 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m7dz9"
Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.513915 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae3dc9f_133c_42a5_82ef_23750fb2ffec.slice/crio-b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9 WatchSource:0}: Error finding container b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9: Status 404 returned error can't find the container with id b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.520880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.525366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.537645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.576553 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.619826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.619874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620376 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.723698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.723747 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.724593 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.729333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.731350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.735896 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"]
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.737441 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.740150 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a85ed5_3f32_48e8_95b3_4576eb4ae0ea.slice/crio-f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68 WatchSource:0}: Error finding container f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68: Status 404 returned error can't find the container with id f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.745762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.809066 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cf97f8476-44v57"
Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.812719 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"]
Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.836269 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92200730_c944_47cc_bed8_8f8f7ac84819.slice/crio-ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135 WatchSource:0}: Error finding container ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135: Status 404 returned error can't find the container with id ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.016795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf97f8476-44v57"]
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.084659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"]
Feb 26 20:07:52 crc kubenswrapper[4722]: W0226 20:07:52.094461 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b96d96_cf6b_46a4_89c5_4a9e1b2669c7.slice/crio-0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045 WatchSource:0}: Error finding container 0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045: Status 404 returned error can't find the container with id 0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.204245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf97f8476-44v57" event={"ID":"806e3405-66f1-447a-8c9b-ba154b44a8da","Type":"ContainerStarted","Data":"671b69c85e08c3a1145a11673fa5bf299d3bc630be73120ce0c52ee5f95b24f4"}
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.205194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" event={"ID":"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7","Type":"ContainerStarted","Data":"0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045"}
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.206177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68"}
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.207122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" event={"ID":"92200730-c944-47cc-bed8-8f8f7ac84819","Type":"ContainerStarted","Data":"ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135"}
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.208241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m7dz9" event={"ID":"fae3dc9f-133c-42a5-82ef-23750fb2ffec","Type":"ContainerStarted","Data":"b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9"}
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.997922 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g2lbh"
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.998228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g2lbh"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.035320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g2lbh"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.217385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf97f8476-44v57" event={"ID":"806e3405-66f1-447a-8c9b-ba154b44a8da","Type":"ContainerStarted","Data":"12f5a567645004ae8a3ecbe18499436ed8024a041185faefd8154373ffee479e"}
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.237174 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cf97f8476-44v57" podStartSLOduration=2.237154882 podStartE2EDuration="2.237154882s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:07:53.234340926 +0000 UTC m=+815.771308860" watchObservedRunningTime="2026-02-26 20:07:53.237154882 +0000 UTC m=+815.774122826"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.256388 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g2lbh"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487280 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487653 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487704 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.488326 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.488422 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" gracePeriod=600
Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.871876 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"]
Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225795 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" exitCode=0
Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"}
Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"}
Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225926 4722 scope.go:117] "RemoveContainer"
containerID="8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.232513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" event={"ID":"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7","Type":"ContainerStarted","Data":"71bb773595fcc93c7c5d8b8348747b39720dac041159fb34d492b263793ec6ea"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.234323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"c37af38d02c23c4a1f39f6d46475c6311a4f2eced7924554439531d14f55cbac"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.238013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" event={"ID":"92200730-c944-47cc-bed8-8f8f7ac84819","Type":"ContainerStarted","Data":"b3a7d329e6a360c8050479c7743ef4ba6e22510455127e604c6dc43402433460"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.238151 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.240025 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g2lbh" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" containerID="cri-o://e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" gracePeriod=2 Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.240801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m7dz9" event={"ID":"fae3dc9f-133c-42a5-82ef-23750fb2ffec","Type":"ContainerStarted","Data":"e0d7e1387225f5fe8943f8370c35725dde95630662f163caab861f7af7fdff57"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 
20:07:55.240831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.249026 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" podStartSLOduration=1.530681355 podStartE2EDuration="4.249005715s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:52.09727095 +0000 UTC m=+814.634238874" lastFinishedPulling="2026-02-26 20:07:54.81559531 +0000 UTC m=+817.352563234" observedRunningTime="2026-02-26 20:07:55.244867173 +0000 UTC m=+817.781835097" watchObservedRunningTime="2026-02-26 20:07:55.249005715 +0000 UTC m=+817.785973629" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.293500 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" podStartSLOduration=1.27967092 podStartE2EDuration="4.293463236s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.841397404 +0000 UTC m=+814.378365328" lastFinishedPulling="2026-02-26 20:07:54.85518972 +0000 UTC m=+817.392157644" observedRunningTime="2026-02-26 20:07:55.288717609 +0000 UTC m=+817.825685543" watchObservedRunningTime="2026-02-26 20:07:55.293463236 +0000 UTC m=+817.830431180" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.309916 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-m7dz9" podStartSLOduration=1.006196479 podStartE2EDuration="4.309895801s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.515939677 +0000 UTC m=+814.052907591" lastFinishedPulling="2026-02-26 20:07:54.819638989 +0000 UTC m=+817.356606913" observedRunningTime="2026-02-26 20:07:55.307846266 +0000 UTC m=+817.844814200" watchObservedRunningTime="2026-02-26 20:07:55.309895801 
+0000 UTC m=+817.846863725" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.563573 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.677894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.678242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.678358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.679261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities" (OuterVolumeSpecName: "utilities") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.683380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws" (OuterVolumeSpecName: "kube-api-access-sxsws") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "kube-api-access-sxsws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.779874 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.779921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.252748 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" exitCode=0 Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.252906 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"} Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"38e167c2e244696bea734e2faa94be954c00560998499fd0aff68b2debca0404"} Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253572 4722 scope.go:117] "RemoveContainer" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.276966 4722 scope.go:117] "RemoveContainer" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.297680 4722 scope.go:117] "RemoveContainer" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.331959 4722 scope.go:117] "RemoveContainer" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.332939 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": container with ID starting with e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24 not found: ID does not exist" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333083 4722 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"} err="failed to get container status \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": rpc error: code = NotFound desc = could not find container \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": container with ID starting with e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24 not found: ID does not exist" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333289 4722 scope.go:117] "RemoveContainer" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.333919 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": container with ID starting with d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e not found: ID does not exist" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333966 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e"} err="failed to get container status \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": rpc error: code = NotFound desc = could not find container \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": container with ID starting with d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e not found: ID does not exist" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333989 4722 scope.go:117] "RemoveContainer" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.334790 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": container with ID starting with ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2 not found: ID does not exist" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.334870 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2"} err="failed to get container status \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": rpc error: code = NotFound desc = could not find container \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": container with ID starting with ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2 not found: ID does not exist" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.160272 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.201698 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.484225 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.487271 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.152842 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" path="/var/lib/kubelet/pods/d4cfa957-34b0-4b59-a010-4cfb763f0564/volumes" Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.269220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"d348e624d87ba4a5866eaf15f07c6de6011bfaada133ce875e53f89400c534f5"} Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.286402 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" podStartSLOduration=1.014279317 podStartE2EDuration="7.286385198s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.746331544 +0000 UTC m=+814.283299468" lastFinishedPulling="2026-02-26 20:07:58.018437425 +0000 UTC m=+820.555405349" observedRunningTime="2026-02-26 20:07:58.282494953 +0000 UTC m=+820.819462877" watchObservedRunningTime="2026-02-26 20:07:58.286385198 +0000 UTC m=+820.823353122" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.128913 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-content" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129638 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-content" Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129667 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-utilities" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129679 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-utilities" Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129716 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129905 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.130557 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133976 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.141107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.241993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.274288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " 
pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.468255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.862873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: W0226 20:08:00.878330 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f7c080_b1b3_4173_8cad_c6d58715daf2.slice/crio-02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52 WatchSource:0}: Error finding container 02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52: Status 404 returned error can't find the container with id 02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52 Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.287127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerStarted","Data":"02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52"} Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.528982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.810559 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.811498 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.816502 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.227861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.229330 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.239654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.299317 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.367781 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx58d\" (UniqueName: 
\"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.473346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.496715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.555319 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.966336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: W0226 20:08:02.970599 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d5d931_706d_40ca_83ae_23333efa3655.slice/crio-92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9 WatchSource:0}: Error finding container 92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9: Status 404 returned error can't find the container with id 92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300664 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" exitCode=0 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" 
event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec"} Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300754 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerStarted","Data":"92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9"} Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.302167 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerID="8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278" exitCode=0 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.302806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerDied","Data":"8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278"} Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.310242 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" exitCode=0 Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.310336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70"} Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.591216 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.704122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.713305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x" (OuterVolumeSpecName: "kube-api-access-vh54x") pod "d8f7c080-b1b3-4173-8cad-c6d58715daf2" (UID: "d8f7c080-b1b3-4173-8cad-c6d58715daf2"). InnerVolumeSpecName "kube-api-access-vh54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.805869 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerDied","Data":"02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52"} Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324584 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324615 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.647167 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"] Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.650654 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"] Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.153279 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" path="/var/lib/kubelet/pods/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66/volumes" Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.331847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerStarted","Data":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.355622 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tsqv2" podStartSLOduration=1.96561561 podStartE2EDuration="4.355600883s" podCreationTimestamp="2026-02-26 20:08:02 +0000 UTC" firstStartedPulling="2026-02-26 20:08:03.303330678 +0000 UTC m=+825.840298602" lastFinishedPulling="2026-02-26 20:08:05.693315931 +0000 UTC m=+828.230283875" observedRunningTime="2026-02-26 20:08:06.350169046 +0000 UTC m=+828.887137020" watchObservedRunningTime="2026-02-26 20:08:06.355600883 +0000 UTC m=+828.892568807" Feb 26 20:08:11 crc kubenswrapper[4722]: I0226 20:08:11.457312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.556083 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.556188 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.594290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:13 crc kubenswrapper[4722]: I0226 20:08:13.415470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:13 crc kubenswrapper[4722]: I0226 20:08:13.826491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.386374 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tsqv2" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" containerID="cri-o://fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" gracePeriod=2 Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.749259 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760093 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760199 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.761026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities" (OuterVolumeSpecName: "utilities") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.771341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d" (OuterVolumeSpecName: "kube-api-access-jx58d") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "kube-api-access-jx58d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.802536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862072 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862110 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862127 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395642 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" exitCode=0 Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9"} Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395709 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395752 4722 scope.go:117] "RemoveContainer" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.414605 4722 scope.go:117] "RemoveContainer" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.420939 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.425721 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.436465 4722 scope.go:117] "RemoveContainer" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458061 4722 scope.go:117] "RemoveContainer" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 20:08:16.458584 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": container with ID starting with fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c not found: ID does not exist" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458639 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} err="failed to get container status \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": rpc error: code = NotFound desc = could not find container \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": container with ID starting with fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c not found: ID does not exist" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458671 4722 scope.go:117] "RemoveContainer" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 20:08:16.459004 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": container with ID starting with d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70 not found: ID does not exist" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459033 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70"} err="failed to get container status \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": rpc error: code = NotFound desc = could not find container \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": container with ID starting with d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70 not found: ID does not exist" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459048 4722 scope.go:117] "RemoveContainer" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 
20:08:16.459397 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": container with ID starting with 2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec not found: ID does not exist" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459437 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec"} err="failed to get container status \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": rpc error: code = NotFound desc = could not find container \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": container with ID starting with 2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec not found: ID does not exist" Feb 26 20:08:18 crc kubenswrapper[4722]: I0226 20:08:18.154673 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d5d931-706d-40ca-83ae-23333efa3655" path="/var/lib/kubelet/pods/04d5d931-706d-40ca-83ae-23333efa3655/volumes" Feb 26 20:08:18 crc kubenswrapper[4722]: I0226 20:08:18.698322 4722 scope.go:117] "RemoveContainer" containerID="5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.105516 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106276 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106288 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" 
containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-utilities" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106305 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-utilities" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106314 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-content" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106322 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-content" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106330 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106335 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106452 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106462 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.107337 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.109894 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.159604 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.272910 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.272979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.273318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: 
I0226 20:08:24.374801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.374849 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.374883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.375418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.375437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.397716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.449586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.851455 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470501 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="7b74315faa7fd252565d60d2769fe7bf91e41dd84c9c707191e41aa76e86f519" exitCode=0 Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"7b74315faa7fd252565d60d2769fe7bf91e41dd84c9c707191e41aa76e86f519"} Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerStarted","Data":"8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c"} Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.445923 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-n77d2" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" containerID="cri-o://4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" gracePeriod=15 Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.537593 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="ef280eddb353925c6cd8093fb93043926d2755f804d105e428356833a7d2c618" exitCode=0 Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.537685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"ef280eddb353925c6cd8093fb93043926d2755f804d105e428356833a7d2c618"} Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.875268 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n77d2_46842c31-3b12-4cbf-b722-327327cf8375/console/0.log" Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.875506 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.045869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.045967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046828 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config" (OuterVolumeSpecName: "console-config") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.047356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca" (OuterVolumeSpecName: "service-ca") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.052725 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj" (OuterVolumeSpecName: "kube-api-access-bk7gj") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "kube-api-access-bk7gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.052870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.053345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148131 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148217 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148238 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148260 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148280 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148298 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148316 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc 
kubenswrapper[4722]: I0226 20:08:28.548676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n77d2_46842c31-3b12-4cbf-b722-327327cf8375/console/0.log" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548749 4722 generic.go:334] "Generic (PLEG): container finished" podID="46842c31-3b12-4cbf-b722-327327cf8375" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" exitCode=2 Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerDied","Data":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"} Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerDied","Data":"d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b"} Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548903 4722 scope.go:117] "RemoveContainer" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.549039 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.558177 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="81759cecf8fc2d893661540734c9bf52804234f78284eac645542da2d51e06c4" exitCode=0 Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.558214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"81759cecf8fc2d893661540734c9bf52804234f78284eac645542da2d51e06c4"} Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.576947 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.579369 4722 scope.go:117] "RemoveContainer" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" Feb 26 20:08:28 crc kubenswrapper[4722]: E0226 20:08:28.579978 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": container with ID starting with 4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1 not found: ID does not exist" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.580085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"} err="failed to get container status \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": rpc error: code = NotFound desc = could not find container \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": container with ID starting with 
4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1 not found: ID does not exist" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.589944 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.866406 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.978950 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle" (OuterVolumeSpecName: "bundle") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.986328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k" (OuterVolumeSpecName: "kube-api-access-4wb7k") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "kube-api-access-4wb7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.993257 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util" (OuterVolumeSpecName: "util") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078754 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078765 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.154259 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46842c31-3b12-4cbf-b722-327327cf8375" path="/var/lib/kubelet/pods/46842c31-3b12-4cbf-b722-327327cf8375/volumes" Feb 26 20:08:30 crc 
kubenswrapper[4722]: I0226 20:08:30.577219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c"} Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.577264 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c" Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.577358 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820015 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"] Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820739 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="pull" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820753 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="pull" Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820765 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="util" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820770 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="util" Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820789 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820795 4722 
state_mem.go:107] "Deleted CPUSet assignment" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820805 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820810 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820909 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820922 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.821396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.824168 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.824248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-m7grk" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.825032 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.826130 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.830678 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"metallb-system"/"openshift-service-ca.crt" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.843512 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"] Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.063885 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"] Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.064821 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.066693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.067185 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.067402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xn9v9" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.077664 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"] Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod 
\"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.102696 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.106741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.126941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.174544 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194346 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295540 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.300925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.301608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.313526 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: 
\"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.377903 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.413657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"] Feb 26 20:08:39 crc kubenswrapper[4722]: W0226 20:08:39.425491 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52abafd1_b7e2_4dcc_85dd_d4dd5abd0c2d.slice/crio-5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb WatchSource:0}: Error finding container 5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb: Status 404 returned error can't find the container with id 5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.625662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" event={"ID":"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d","Type":"ContainerStarted","Data":"5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb"} Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.695388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"] Feb 26 20:08:39 crc kubenswrapper[4722]: W0226 20:08:39.703359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fe1c7f0_4dea_4bd4_bcfc_c9e4486ec09b.slice/crio-6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc WatchSource:0}: Error finding container 
6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc: Status 404 returned error can't find the container with id 6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc Feb 26 20:08:40 crc kubenswrapper[4722]: I0226 20:08:40.632070 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" event={"ID":"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b","Type":"ContainerStarted","Data":"6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc"} Feb 26 20:08:42 crc kubenswrapper[4722]: I0226 20:08:42.646523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" event={"ID":"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d","Type":"ContainerStarted","Data":"e531e7e3d117b933062940756affeea4312e7fd413b6c16b79c097e2cef3e247"} Feb 26 20:08:42 crc kubenswrapper[4722]: I0226 20:08:42.646784 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.660700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" event={"ID":"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b","Type":"ContainerStarted","Data":"3a1f7caf0b1d359bb23dc1bcf0a04880b7b730eff4b1d15aadc65f9f4a3e3eb1"} Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.661050 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.686932 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" podStartSLOduration=1.2506167750000001 podStartE2EDuration="5.686917391s" podCreationTimestamp="2026-02-26 20:08:39 +0000 UTC" firstStartedPulling="2026-02-26 
20:08:39.706482014 +0000 UTC m=+862.243449938" lastFinishedPulling="2026-02-26 20:08:44.14278263 +0000 UTC m=+866.679750554" observedRunningTime="2026-02-26 20:08:44.686020877 +0000 UTC m=+867.222988801" watchObservedRunningTime="2026-02-26 20:08:44.686917391 +0000 UTC m=+867.223885315" Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.689852 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" podStartSLOduration=3.721004702 podStartE2EDuration="6.689845459s" podCreationTimestamp="2026-02-26 20:08:38 +0000 UTC" firstStartedPulling="2026-02-26 20:08:39.435252322 +0000 UTC m=+861.972220246" lastFinishedPulling="2026-02-26 20:08:42.404093079 +0000 UTC m=+864.941061003" observedRunningTime="2026-02-26 20:08:42.672335161 +0000 UTC m=+865.209303105" watchObservedRunningTime="2026-02-26 20:08:44.689845459 +0000 UTC m=+867.226813383" Feb 26 20:08:59 crc kubenswrapper[4722]: I0226 20:08:59.384321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.179892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.918286 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"] Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.919030 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.924251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-h42tc" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.924325 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.926751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l46cn"] Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.929055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.932593 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.935672 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.985495 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"] Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.006877 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-q9jh2"] Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.008105 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011210 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gm8tf" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011955 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.013850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 
20:09:20.038897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039154 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039236 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.042847 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"] Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.043940 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.047784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.054672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"] Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod 
\"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" 
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141493 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.141848 4722 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.142181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert podName:0ee913a7-6a3f-46e5-99f8-d405722ef55e nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.642159775 +0000 UTC m=+903.179127709 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert") pod "frr-k8s-webhook-server-7f989f654f-s8rl7" (UID: "0ee913a7-6a3f-46e5-99f8-d405722ef55e") : secret "frr-k8s-webhook-server-cert" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID: 
\"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.147715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.162595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.171937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.242572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.243513 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.243559 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist podName:de675145-f60b-4c0c-b5c9-ef0b33e10c29 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.743544301 +0000 UTC m=+903.280512225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist") pod "speaker-q9jh2" (UID: "de675145-f60b-4c0c-b5c9-ef0b33e10c29") : secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243629 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.244559 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc 
kubenswrapper[4722]: I0226 20:09:20.246556 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.259671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.345483 4722 secret.go:188] Couldn't get secret 
metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.345568 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs podName:80c4aae3-6c63-43f6-8dcb-46e953562c67 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.845552983 +0000 UTC m=+903.382520927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs") pod "controller-86ddb6bd46-gpj96" (UID: "80c4aae3-6c63-43f6-8dcb-46e953562c67") : secret "controller-certs-secret" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.347683 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.360294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.363363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.368349 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.648431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.655854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.750268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.750466 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.750556 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist podName:de675145-f60b-4c0c-b5c9-ef0b33e10c29 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:21.750533803 +0000 UTC m=+904.287501737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist") pod "speaker-q9jh2" (UID: "de675145-f60b-4c0c-b5c9-ef0b33e10c29") : secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.835953 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.851636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.854856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.890352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"424f94dad5212a10d5f3980733942176efce7d0b1aff1d88488fef443a25fb91"} Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.958170 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.045013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"] Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.152829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"] Feb 26 20:09:21 crc kubenswrapper[4722]: W0226 20:09:21.157540 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c4aae3_6c63_43f6_8dcb_46e953562c67.slice/crio-a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998 WatchSource:0}: Error finding container a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998: Status 404 returned error can't find the container with id a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998 Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.760546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.779992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.821356 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: W0226 20:09:21.849235 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde675145_f60b_4c0c_b5c9_ef0b33e10c29.slice/crio-4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032 WatchSource:0}: Error finding container 4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032: Status 404 returned error can't find the container with id 4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032 Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.898093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" event={"ID":"0ee913a7-6a3f-46e5-99f8-d405722ef55e","Type":"ContainerStarted","Data":"c4ae07dcd81bf5ab0bfd5ec798447427172b509268f6fefaba61044a72537cde"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.899192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.901965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"b2e0225cf570b8a1f43ecb4e6777f7fe56f60bafba5033d74ed7f53cffd1a802"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.901991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"e3446b3a2d6a163c1ff4ce9f7c3c3b91d667a55579637bf06033f91fa2125e6f"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.902000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.902729 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.930792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-gpj96" podStartSLOduration=1.930765547 podStartE2EDuration="1.930765547s" podCreationTimestamp="2026-02-26 20:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:09:21.920507501 +0000 UTC m=+904.457475425" watchObservedRunningTime="2026-02-26 20:09:21.930765547 +0000 UTC m=+904.467733481" Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.909316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"d272fc7011be63f2d34fd5dd853f72990e7eaef6f6d509e0758bb64fb88d53ec"} Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.909666 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"7cd8e002e94fc2503a4d0e4d9301e74e3ef61d814345f72db75f31a4ef23326f"} Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.927196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-q9jh2" podStartSLOduration=3.927177619 podStartE2EDuration="3.927177619s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
20:09:22.92421788 +0000 UTC m=+905.461185824" watchObservedRunningTime="2026-02-26 20:09:22.927177619 +0000 UTC m=+905.464145563" Feb 26 20:09:23 crc kubenswrapper[4722]: I0226 20:09:23.915507 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.976290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" event={"ID":"0ee913a7-6a3f-46e5-99f8-d405722ef55e","Type":"ContainerStarted","Data":"fdc590b300cd8af71c0d834a73ac03a29614cd05fc883e8514cb08b3f48d3022"} Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.976874 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.977706 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="dd84034c13822326203d6507e6fa80dfd86b04e7eef530fa0461cfc657c4c262" exitCode=0 Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.977739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"dd84034c13822326203d6507e6fa80dfd86b04e7eef530fa0461cfc657c4c262"} Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.994892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" podStartSLOduration=2.778936905 podStartE2EDuration="9.99487316s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="2026-02-26 20:09:21.052261645 +0000 UTC m=+903.589229569" lastFinishedPulling="2026-02-26 20:09:28.2681979 +0000 UTC m=+910.805165824" observedRunningTime="2026-02-26 20:09:28.990960864 +0000 UTC m=+911.527928798" watchObservedRunningTime="2026-02-26 20:09:28.99487316 +0000 UTC 
m=+911.531841094" Feb 26 20:09:29 crc kubenswrapper[4722]: I0226 20:09:29.984700 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="e80228fd0185731e46886e8d3ff462549a84ccb814d0a9fd7720e44c050cbbf4" exitCode=0 Feb 26 20:09:29 crc kubenswrapper[4722]: I0226 20:09:29.984748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"e80228fd0185731e46886e8d3ff462549a84ccb814d0a9fd7720e44c050cbbf4"} Feb 26 20:09:30 crc kubenswrapper[4722]: I0226 20:09:30.995516 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="dca5b008aa043284c2f68b604be7be8acaacda0a4dec1a31d8c72fe6f21d8e7f" exitCode=0 Feb 26 20:09:30 crc kubenswrapper[4722]: I0226 20:09:30.995689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"dca5b008aa043284c2f68b604be7be8acaacda0a4dec1a31d8c72fe6f21d8e7f"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.004926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"c9b73f0564d605bc298ffbdf33276bb30243c5364b80214644fadeadef5d78d3"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"c88d82618c38d14aae83544cdf0ccc8274399e204fdeb08f1a473f0a1144309b"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" 
event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"398f3d86bf3af0e86a49eff493f30a0c735a04db5e2bebb242aac18a3ebfc635"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"dc12f97fa6e8cb27f253327fea1e80d0d21c7bfcb557288d6a6aebe0b5fe0e18"} Feb 26 20:09:33 crc kubenswrapper[4722]: I0226 20:09:33.015660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"dd3691e731a50a342633d4e25a466344f6c5c93b3bd233b5bdab17e1f58b0113"} Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.024894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"ef7d22c858d2b9a03d2835732b8633c057d54fab7d2a0425ab400b65b68c33f5"} Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.025203 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.045678 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l46cn" podStartSLOduration=7.170480517 podStartE2EDuration="15.045642387s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="2026-02-26 20:09:20.367970096 +0000 UTC m=+902.904938030" lastFinishedPulling="2026-02-26 20:09:28.243131976 +0000 UTC m=+910.780099900" observedRunningTime="2026-02-26 20:09:34.0446524 +0000 UTC m=+916.581620334" watchObservedRunningTime="2026-02-26 20:09:34.045642387 +0000 UTC m=+916.582610331" Feb 26 20:09:35 crc kubenswrapper[4722]: I0226 20:09:35.243779 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:35 crc kubenswrapper[4722]: I0226 20:09:35.282826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:40 crc kubenswrapper[4722]: I0226 20:09:40.840157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:40 crc kubenswrapper[4722]: I0226 20:09:40.966026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:41 crc kubenswrapper[4722]: I0226 20:09:41.824651 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.577950 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.579231 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.581786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.582516 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-47mgw" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.583213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.610981 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.702609 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.803963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.824759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: 
\"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.903830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:45 crc kubenswrapper[4722]: I0226 20:09:45.333248 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:46 crc kubenswrapper[4722]: I0226 20:09:46.100974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerStarted","Data":"f46c8c8b3d90e53050041918ba68ca6e056dbca2a614f082ccf8f74e6c34d1cd"} Feb 26 20:09:47 crc kubenswrapper[4722]: I0226 20:09:47.958546 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.114319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerStarted","Data":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.134803 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mrs9q" podStartSLOduration=1.894467304 podStartE2EDuration="4.134785822s" podCreationTimestamp="2026-02-26 20:09:44 +0000 UTC" firstStartedPulling="2026-02-26 20:09:45.360153117 +0000 UTC m=+927.897121041" lastFinishedPulling="2026-02-26 20:09:47.600471635 +0000 UTC m=+930.137439559" observedRunningTime="2026-02-26 20:09:48.132032788 +0000 UTC m=+930.669000732" watchObservedRunningTime="2026-02-26 20:09:48.134785822 +0000 UTC m=+930.671753746" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.569883 
4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.571102 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.572293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.576420 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.673322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.690848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.887526 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.120029 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mrs9q" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" containerID="cri-o://8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" gracePeriod=2 Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.285785 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.491828 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.585656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.590915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk" (OuterVolumeSpecName: "kube-api-access-ct6nk") pod "cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" (UID: "cae9421f-3b70-4ef9-9ee3-6d5977c96fa4"). InnerVolumeSpecName "kube-api-access-ct6nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.686883 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") on node \"crc\" DevicePath \"\"" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129118 4722 generic.go:334] "Generic (PLEG): container finished" podID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" exitCode=0 Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129172 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerDied","Data":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerDied","Data":"f46c8c8b3d90e53050041918ba68ca6e056dbca2a614f082ccf8f74e6c34d1cd"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129253 4722 scope.go:117] "RemoveContainer" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.132626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f7qpg" event={"ID":"73eb4662-b5c2-4bad-a2ee-6bfbe704e239","Type":"ContainerStarted","Data":"2eef7ee7790e63546a2e590582c26c3c7b92cd625f9a38f356406a8304a2f8cb"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 
20:09:50.132665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f7qpg" event={"ID":"73eb4662-b5c2-4bad-a2ee-6bfbe704e239","Type":"ContainerStarted","Data":"3973e64074c25377ecb8e281a6c0e20be82201ab5a47e710cf39dd2a7b88a0c3"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.149667 4722 scope.go:117] "RemoveContainer" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: E0226 20:09:50.150075 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": container with ID starting with 8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853 not found: ID does not exist" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.150156 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} err="failed to get container status \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": rpc error: code = NotFound desc = could not find container \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": container with ID starting with 8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853 not found: ID does not exist" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.157578 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f7qpg" podStartSLOduration=2.099864569 podStartE2EDuration="2.157555991s" podCreationTimestamp="2026-02-26 20:09:48 +0000 UTC" firstStartedPulling="2026-02-26 20:09:49.314244116 +0000 UTC m=+931.851212060" lastFinishedPulling="2026-02-26 20:09:49.371935528 +0000 UTC m=+931.908903482" 
observedRunningTime="2026-02-26 20:09:50.15157098 +0000 UTC m=+932.688538954" watchObservedRunningTime="2026-02-26 20:09:50.157555991 +0000 UTC m=+932.694523925" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.167991 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.172007 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.246026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:52 crc kubenswrapper[4722]: I0226 20:09:52.154066 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" path="/var/lib/kubelet/pods/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4/volumes" Feb 26 20:09:53 crc kubenswrapper[4722]: I0226 20:09:53.487185 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:09:53 crc kubenswrapper[4722]: I0226 20:09:53.487853 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.888109 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.888438 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.922961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:59 crc kubenswrapper[4722]: I0226 20:09:59.232915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.137266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:00 crc kubenswrapper[4722]: E0226 20:10:00.139732 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.140311 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.140835 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.142272 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.142370 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147320 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147616 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.229802 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.331170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.355100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.474676 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.893441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.032038 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.034751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.036995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-p7zpm" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.039567 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.055922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.055985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " 
pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.056099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 
20:10:01.157776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.182879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.203216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerStarted","Data":"d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247"} Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.351447 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.750350 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: W0226 20:10:01.758758 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0b5d69_915c_419e_89e6_9600523f5284.slice/crio-41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2 WatchSource:0}: Error finding container 41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2: Status 404 returned error can't find the container with id 41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2 Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211268 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="ba242fa345ecbaca056ee15e7290a61592c7f8023fece725e3454514989e3683" exitCode=0 Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"ba242fa345ecbaca056ee15e7290a61592c7f8023fece725e3454514989e3683"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerStarted","Data":"41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.215039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerStarted","Data":"729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.242264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" podStartSLOduration=1.223845676 podStartE2EDuration="2.24224502s" podCreationTimestamp="2026-02-26 20:10:00 +0000 UTC" firstStartedPulling="2026-02-26 20:10:00.910598144 +0000 UTC m=+943.447566108" lastFinishedPulling="2026-02-26 20:10:01.928997528 +0000 UTC m=+944.465965452" observedRunningTime="2026-02-26 20:10:02.242191189 +0000 UTC m=+944.779159133" watchObservedRunningTime="2026-02-26 20:10:02.24224502 +0000 UTC m=+944.779212964" Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.223356 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="93c1d530273d75bec071aa1b2d6d801e71b2f22452de425e29a559b2803bddec" exitCode=0 Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.223415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"93c1d530273d75bec071aa1b2d6d801e71b2f22452de425e29a559b2803bddec"} Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.225923 4722 generic.go:334] "Generic (PLEG): container finished" podID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerID="729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560" exitCode=0 Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.225959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" 
event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerDied","Data":"729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560"} Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.237889 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="4b47c9e5fc77d7ab07d766e7dec92cd59071de6811a31329dedce70e3408a721" exitCode=0 Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.237946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"4b47c9e5fc77d7ab07d766e7dec92cd59071de6811a31329dedce70e3408a721"} Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.533572 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.704004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.711030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k" (OuterVolumeSpecName: "kube-api-access-sh96k") pod "7d4066f0-78d5-4810-9b52-358ed4e1efbd" (UID: "7d4066f0-78d5-4810-9b52-358ed4e1efbd"). InnerVolumeSpecName "kube-api-access-sh96k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.805419 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.249509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerDied","Data":"d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247"} Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.251171 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.249548 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.322462 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.326809 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.524387 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.731543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle" (OuterVolumeSpecName: "bundle") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.750380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4" (OuterVolumeSpecName: "kube-api-access-k56q4") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "kube-api-access-k56q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.755812 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util" (OuterVolumeSpecName: "util") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831923 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831966 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831979 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.154262 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" path="/var/lib/kubelet/pods/c1a0b333-4923-4483-b110-ea7109c80c67/volumes" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2"} Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263931 4722 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263970 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.049380 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050097 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="util" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050107 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="util" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050127 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050162 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050173 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="pull" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050182 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" 
containerName="pull" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050342 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050354 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050873 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.056946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-x4fpd" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.060432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.110517 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.161645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc 
kubenswrapper[4722]: I0226 20:10:08.196445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.367042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.788504 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:09 crc kubenswrapper[4722]: I0226 20:10:09.282097 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" event={"ID":"47a13091-6ef0-488e-98aa-beb72bc48ce6","Type":"ContainerStarted","Data":"1d3ab3694edf59193de5bf53a91fc4eb776b0e5bd542a13f836a3cd0a507a29d"} Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.321183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" event={"ID":"47a13091-6ef0-488e-98aa-beb72bc48ce6","Type":"ContainerStarted","Data":"b80c7a0f381f80c207aedeefbc83d2c669fdb68ec9f3dd97b0f7d9e93dc002df"} Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.321789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.355521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" 
podStartSLOduration=1.2790071379999999 podStartE2EDuration="6.355492797s" podCreationTimestamp="2026-02-26 20:10:08 +0000 UTC" firstStartedPulling="2026-02-26 20:10:08.803615565 +0000 UTC m=+951.340583489" lastFinishedPulling="2026-02-26 20:10:13.880101224 +0000 UTC m=+956.417069148" observedRunningTime="2026-02-26 20:10:14.348649122 +0000 UTC m=+956.885617096" watchObservedRunningTime="2026-02-26 20:10:14.355492797 +0000 UTC m=+956.892460761" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.772364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.774290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.787163 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.790991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.791157 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.791204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.893025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.915303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:15 crc kubenswrapper[4722]: I0226 20:10:15.092780 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:15 crc kubenswrapper[4722]: I0226 20:10:15.347049 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:15 crc kubenswrapper[4722]: W0226 20:10:15.352578 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51650a1c_a5f4_4e25_88dd_50f6cfdb1675.slice/crio-d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726 WatchSource:0}: Error finding container d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726: Status 404 returned error can't find the container with id d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726 Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339827 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" exitCode=0 Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" 
event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9"} Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726"} Feb 26 20:10:17 crc kubenswrapper[4722]: I0226 20:10:17.348528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.356641 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" exitCode=0 Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.356744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.804888 4722 scope.go:117] "RemoveContainer" containerID="45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974" Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.964898 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.966344 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.978864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049862 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151189 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.152234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.152281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.169902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.304378 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.367086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.389165 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlfqb" podStartSLOduration=2.9408990040000003 podStartE2EDuration="5.389127304s" podCreationTimestamp="2026-02-26 20:10:14 +0000 UTC" firstStartedPulling="2026-02-26 20:10:16.342997717 +0000 UTC m=+958.879965661" lastFinishedPulling="2026-02-26 20:10:18.791226027 +0000 UTC m=+961.328193961" observedRunningTime="2026-02-26 20:10:19.388074845 +0000 UTC m=+961.925042779" watchObservedRunningTime="2026-02-26 20:10:19.389127304 +0000 UTC m=+961.926095248" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.753943 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:19 crc kubenswrapper[4722]: W0226 20:10:19.757625 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a6406b_4cf1_4e69_b609_d3d91506ef5a.slice/crio-1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5 WatchSource:0}: Error finding container 1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5: Status 404 returned error can't find the container with id 1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5 Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375284 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26" 
exitCode=0 Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"} Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375425 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerStarted","Data":"1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5"} Feb 26 20:10:22 crc kubenswrapper[4722]: I0226 20:10:22.388548 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0" exitCode=0 Feb 26 20:10:22 crc kubenswrapper[4722]: I0226 20:10:22.388614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"} Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.397765 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerStarted","Data":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"} Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.423997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzhk2" podStartSLOduration=2.6926880349999998 podStartE2EDuration="5.423970535s" podCreationTimestamp="2026-02-26 20:10:18 +0000 UTC" firstStartedPulling="2026-02-26 20:10:20.377390446 +0000 UTC m=+962.914358370" lastFinishedPulling="2026-02-26 
20:10:23.108672946 +0000 UTC m=+965.645640870" observedRunningTime="2026-02-26 20:10:23.413241326 +0000 UTC m=+965.950209260" watchObservedRunningTime="2026-02-26 20:10:23.423970535 +0000 UTC m=+965.960938489" Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.487263 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.487309 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.093759 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.094416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.165504 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.472280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.359158 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.370317 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.445780 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlfqb" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server" containerID="cri-o://1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" gracePeriod=2 Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.304822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.305384 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.361992 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.497008 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.046232 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.208290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities" (OuterVolumeSpecName: "utilities") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.212411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz" (OuterVolumeSpecName: "kube-api-access-qldxz") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "kube-api-access-qldxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.256594 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309311 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309342 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309353 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459652 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" exitCode=0 Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459797 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726"} Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459815 4722 scope.go:117] "RemoveContainer" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.460873 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.485621 4722 scope.go:117] "RemoveContainer" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.494634 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.499175 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.526519 4722 scope.go:117] "RemoveContainer" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.546441 4722 scope.go:117] "RemoveContainer" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: E0226 20:10:30.546843 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": container with ID starting with 1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b not found: ID does not exist" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 
20:10:30.546883 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} err="failed to get container status \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": rpc error: code = NotFound desc = could not find container \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": container with ID starting with 1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b not found: ID does not exist" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.546909 4722 scope.go:117] "RemoveContainer" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: E0226 20:10:30.547468 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": container with ID starting with 2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700 not found: ID does not exist" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.547487 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} err="failed to get container status \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": rpc error: code = NotFound desc = could not find container \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": container with ID starting with 2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700 not found: ID does not exist" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.547500 4722 scope.go:117] "RemoveContainer" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" Feb 26 20:10:30 crc 
kubenswrapper[4722]: E0226 20:10:30.548119 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": container with ID starting with e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9 not found: ID does not exist" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.548195 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9"} err="failed to get container status \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": rpc error: code = NotFound desc = could not find container \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": container with ID starting with e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9 not found: ID does not exist" Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.155083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" path="/var/lib/kubelet/pods/51650a1c-a5f4-4e25-88dd-50f6cfdb1675/volumes" Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.958968 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.959219 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzhk2" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server" containerID="cri-o://77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" gracePeriod=2 Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.472516 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488457 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" exitCode=0 Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"} Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488509 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5"} Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488541 4722 scope.go:117] "RemoveContainer" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.520439 4722 scope.go:117] "RemoveContainer" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.555567 4722 scope.go:117] "RemoveContainer" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.602423 4722 scope.go:117] "RemoveContainer" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.611176 4722 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": container with ID starting with 77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9 not found: ID does not exist" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611217 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"} err="failed to get container status \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": rpc error: code = NotFound desc = could not find container \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": container with ID starting with 77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9 not found: ID does not exist" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611245 4722 scope.go:117] "RemoveContainer" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0" Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.611875 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": container with ID starting with adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0 not found: ID does not exist" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611923 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"} err="failed to get container status \"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": rpc error: code = NotFound desc = could not find container 
\"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": container with ID starting with adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0 not found: ID does not exist" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611951 4722 scope.go:117] "RemoveContainer" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612339 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612400 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.612571 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": container with ID starting with 160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26 not found: ID does not exist" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612599 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"} err="failed to get container status \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": rpc error: code = NotFound desc = could not find container \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": container with ID starting with 160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26 not found: ID does not exist" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.613242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities" (OuterVolumeSpecName: "utilities") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.625864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr" (OuterVolumeSpecName: "kube-api-access-lqkzr") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "kube-api-access-lqkzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.667436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713565 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713597 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713609 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.813073 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.818544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:36 crc kubenswrapper[4722]: I0226 20:10:36.156584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" path="/var/lib/kubelet/pods/16a6406b-4cf1-4e69-b609-d3d91506ef5a/volumes" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.943433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"] Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944336 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944357 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944364 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944376 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-utilities" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944384 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-utilities" Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-utilities" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-utilities" Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944417 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-content" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944424 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-content" Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944437 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-content" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944444 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-content" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944572 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.945130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.950496 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jncdf" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.951367 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"] Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.952356 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.954076 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-x4np4" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.959632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"] Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.973371 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"] Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.975255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.980430 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k8blz" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.988206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"] Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.007539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.042372 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.043278 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.047387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8lpqs" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.063768 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.065936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.074675 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jr7nm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.089842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhf5h\" (UniqueName: \"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090871 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.114640 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.115867 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.117478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.118640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.127263 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vl7ps" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.140508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.140573 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.143238 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.162966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.164481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.164918 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.166218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.166505 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tjl4b" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.192387 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.193717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhf5h\" (UniqueName: 
\"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.195362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.205190 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dmclx" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.213986 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.219667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.221743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhf5h\" (UniqueName: \"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.227835 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.228610 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.230030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jmph7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.231314 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.231906 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.233532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cjbbz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.252825 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.253605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.255995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8ltgk" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.264631 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.266834 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.267633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.270819 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-x8t5n" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.283326 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.295354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297022 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnmqm\" (UniqueName: \"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: \"a21a637b-e5c6-47ab-a41e-9622452be17e\") " 
pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297160 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc 
kubenswrapper[4722]: I0226 20:10:48.297412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297441 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.297521 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.297585 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:48.797562682 +0000 UTC m=+991.334530606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.302932 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.304710 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.308763 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.319254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.319700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.320574 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.321391 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.324964 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fdr2k" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.329411 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.332491 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.333987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hh2jx" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.342080 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.357328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.366515 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.366874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.371628 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.373047 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.398721 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.400262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bh4dd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.401157 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.405866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnmqm\" (UniqueName: 
\"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: \"a21a637b-e5c6-47ab-a41e-9622452be17e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.409836 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.416502 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.416607 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.421575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2vxkv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.449926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.452525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.462964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnmqm\" (UniqueName: \"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: 
\"a21a637b-e5c6-47ab-a41e-9622452be17e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.463120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.464328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.497888 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.499262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgpc\" (UniqueName: 
\"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.517057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.555451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.556990 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.605761 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8s6bn" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.606422 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.607479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.608099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.609538 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hk77b" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.610051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618520 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgpc\" (UniqueName: \"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618654 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.618792 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.619922 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.119905898 +0000 UTC m=+991.656873822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.632715 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.645906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.650669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.651651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgpc\" (UniqueName: \"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.657610 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.664961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.665304 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.666848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.674784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.683692 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.720694 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.720799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.721035 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.764715 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.770278 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.771320 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.773523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dvbqg" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.798625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.822368 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823220 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.823520 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.823559 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.823545276 +0000 UTC m=+992.360513200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.828031 4722 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7sf9w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.846284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.856407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.897672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.924927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.924974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod 
\"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.935443 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.936377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.938465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wr5gh" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.951944 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.953839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.974336 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.996241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.015229 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.016954 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.025557 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6dk85" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod \"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026622 4722 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.063498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod \"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.063725 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.064859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.094205 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.107633 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133481 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:49 crc 
kubenswrapper[4722]: I0226 20:10:49.133806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.134033 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.134102 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.13408307 +0000 UTC m=+992.671050994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.158962 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.177330 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.179420 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.186900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gbxvn" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.204416 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.213068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.242429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246082 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.246312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " 
pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246336 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.746314102 +0000 UTC m=+992.283282016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246417 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.246455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246468 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.746451056 +0000 UTC m=+992.283418970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.254232 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.270994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.273470 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.304829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.331207 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.343746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.348735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.449882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.472195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.514542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"] Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.530820 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42d4e0f_1071_4cb4_b9ff_90d02236a1a2.slice/crio-5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef WatchSource:0}: Error finding container 5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef: Status 404 returned error can't find the container with id 5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.591543 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.653064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" event={"ID":"a21a637b-e5c6-47ab-a41e-9622452be17e","Type":"ContainerStarted","Data":"910decc3a2dbe3f4d34a00caa4740dd651b7c9be52e3d09f14af6f1420082871"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.658346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" event={"ID":"c59c3e1b-9d18-45eb-a409-bd2176527063","Type":"ContainerStarted","Data":"d789993dec7ecb3e23dc41f45a7aaa9e064c1a53e37f5adb75aed89db7d34fa8"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.659063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" 
event={"ID":"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd","Type":"ContainerStarted","Data":"dbe0c345be178a0529cd458c622b1074e251ea17341b5cab0305db7490ec1c09"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.660216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" event={"ID":"f6b9ed59-4089-4a80-bdae-368d169363f2","Type":"ContainerStarted","Data":"a05dd70939a7d07ea4a3b82c69949134932d07219871cffde1887e0fade6f33e"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.660919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" event={"ID":"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2","Type":"ContainerStarted","Data":"5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.661634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" event={"ID":"109ec0d2-04bf-4476-b14c-51249361da38","Type":"ContainerStarted","Data":"69a042b1f50840ca58d10045cfd4e39bcf5e91ed55a5e740aa2fb4335b0d3d47"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.661830 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.669606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" event={"ID":"604550ce-766e-48bb-a0a7-d14b7708a44e","Type":"ContainerStarted","Data":"6551850d04e674228b06381d363b13edc46b766887fc7fa69ef8ec0daf0c2a58"} Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.673423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" 
event={"ID":"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db","Type":"ContainerStarted","Data":"9cd0c1b4341e3fc7abfdad9d394d3266797dc4f546d475a1af4115ecd2b9eb11"} Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.686839 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96ea9ca_8ca1_41aa_af25_a184c79bf18f.slice/crio-ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237 WatchSource:0}: Error finding container ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237: Status 404 returned error can't find the container with id ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237 Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.697399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"] Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.718636 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4b2367_21d7_4be2_a83b_1932bd988df5.slice/crio-6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc WatchSource:0}: Error finding container 6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc: Status 404 returned error can't find the container with id 6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.755130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.755281 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755319 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755361 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755398 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.755364952 +0000 UTC m=+993.292332876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755413 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.755407584 +0000 UTC m=+993.292375508 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.802261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.808835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.858985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.859151 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.859198 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:51.859184515 +0000 UTC m=+994.396152439 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.918592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.963275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.971357 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"] Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.974450 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcd6197_b9a9_4330_a25f_aab80685aa27.slice/crio-be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec WatchSource:0}: Error finding container be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec: Status 404 returned error can't find the container with id be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.977071 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.981864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"] Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.982434 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod371eef1d_3e55_48bb_8b14_f2c36fbc5689.slice/crio-fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c WatchSource:0}: Error finding container fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c: Status 404 returned error can't find the container with id fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.991101 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efbc411_9d10_4261_952f_5b97cbdc9e48.slice/crio-17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1 WatchSource:0}: Error finding container 17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1: Status 404 returned error can't find the container with id 17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1 Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.993956 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"] Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.999191 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} 
{} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hv6wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-85bcd67d77-fkpjs_openstack-operators(2bcd6197-b9a9-4330-a25f-aab80685aa27): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.000416 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" 
podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27" Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.000763 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b98eee6_c514_4ca3_8544_a6978b6ed230.slice/crio-b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237 WatchSource:0}: Error finding container b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237: Status 404 returned error can't find the container with id b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237 Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.000975 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod710dce51_9c0f_4b66_9f5e_39cfe744f275.slice/crio-55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca WatchSource:0}: Error finding container 55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca: Status 404 returned error can't find the container with id 55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.005769 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rqn28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-pwtl7_openstack-operators(4b98eee6-c514-4ca3-8544-a6978b6ed230): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.006080 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfgpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-c5544_openstack-operators(710dce51-9c0f-4b66-9f5e-39cfe744f275): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.006926 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.007429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.119289 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"] Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.156105 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c3e040_3df2_4b02_9d09_a76bcc90b882.slice/crio-2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95 WatchSource:0}: Error finding container 2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95: Status 404 returned error can't find the container with id 2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95 Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.165569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.165748 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.165810 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.165792802 +0000 UTC m=+994.702760806 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.215965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"] Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.278970 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50694186_e31c_499d_ba48_e5818eeceee5.slice/crio-629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c WatchSource:0}: Error finding container 629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c: Status 404 returned error can't find the container with id 629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.281758 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtx6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hhp7x_openstack-operators(50694186-e31c-499d-ba48-e5818eeceee5): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.282953 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.741614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" event={"ID":"873eb62b-74db-41cc-8249-3578cf2f59b4","Type":"ContainerStarted","Data":"38f3fc258d9d45cc848ba4e385559a22e6ff4506e34d284eaa4dfeba4bc99ff0"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.744370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" event={"ID":"b96ea9ca-8ca1-41aa-af25-a184c79bf18f","Type":"ContainerStarted","Data":"ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.745936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" event={"ID":"2efbc411-9d10-4261-952f-5b97cbdc9e48","Type":"ContainerStarted","Data":"17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.748886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" event={"ID":"710dce51-9c0f-4b66-9f5e-39cfe744f275","Type":"ContainerStarted","Data":"55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.750489 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.758110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" event={"ID":"c3c3e040-3df2-4b02-9d09-a76bcc90b882","Type":"ContainerStarted","Data":"2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.761507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" event={"ID":"5d4b2367-21d7-4be2-a83b-1932bd988df5","Type":"ContainerStarted","Data":"6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.764276 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" event={"ID":"4b98eee6-c514-4ca3-8544-a6978b6ed230","Type":"ContainerStarted","Data":"b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.765753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.776965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.777400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777559 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777628 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.777612447 +0000 UTC m=+995.314580371 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777683 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777707 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.77770087 +0000 UTC m=+995.314668794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.785952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" event={"ID":"532a7206-b336-4471-b9ad-c009c9395015","Type":"ContainerStarted","Data":"c071d2c8dc405200d89e91d1a0f86ae3cfe9757353226790e0dbf2e8fefd9c05"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.797865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" event={"ID":"6bc05a1e-4ace-47bc-af66-42c44dc19b80","Type":"ContainerStarted","Data":"454b87c11cecf84c1a6bab1011f3d359054f4094ff1a66dbe8faeb3584e8d43a"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.802725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" event={"ID":"50694186-e31c-499d-ba48-e5818eeceee5","Type":"ContainerStarted","Data":"629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.806364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.808016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" event={"ID":"2bcd6197-b9a9-4330-a25f-aab80685aa27","Type":"ContainerStarted","Data":"be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.812746 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.815821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" event={"ID":"371eef1d-3e55-48bb-8b14-f2c36fbc5689","Type":"ContainerStarted","Data":"fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c"} Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.837253 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840292 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840663 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840719 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:51 crc kubenswrapper[4722]: I0226 20:10:51.899572 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod 
\"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.899803 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.900126 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:55.900105098 +0000 UTC m=+998.437073022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.204879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.205454 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.205504 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:10:56.205489841 +0000 UTC m=+998.742457765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.814278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.814403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.814551 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.814604 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:56.814586122 +0000 UTC m=+999.351554046 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.815077 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.815149 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:56.815121407 +0000 UTC m=+999.352089341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487022 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487095 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487172 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487846 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487911 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" gracePeriod=600 Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849000 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" exitCode=0 Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"} Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849073 4722 scope.go:117] "RemoveContainer" containerID="12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" Feb 26 20:10:55 crc kubenswrapper[4722]: I0226 20:10:55.960358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: 
\"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:55 crc kubenswrapper[4722]: E0226 20:10:55.960547 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:55 crc kubenswrapper[4722]: E0226 20:10:55.960865 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:03.960846038 +0000 UTC m=+1006.497813952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.265377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.265552 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.265609 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:11:04.265592143 +0000 UTC m=+1006.802560077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.873300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.873536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873641 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873717 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:04.873697126 +0000 UTC m=+1007.410665121 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873721 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873796 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:04.873777419 +0000 UTC m=+1007.410745343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.785515 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.788546 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8fhx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-qjxzz_openstack-operators(6bc05a1e-4ace-47bc-af66-42c44dc19b80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.806501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podUID="6bc05a1e-4ace-47bc-af66-42c44dc19b80" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.928065 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podUID="6bc05a1e-4ace-47bc-af66-42c44dc19b80" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.464565 4722 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.464750 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2gscj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-mxqjv_openstack-operators(b96ea9ca-8ca1-41aa-af25-a184c79bf18f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.466211 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podUID="b96ea9ca-8ca1-41aa-af25-a184c79bf18f" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.956501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podUID="b96ea9ca-8ca1-41aa-af25-a184c79bf18f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.016318 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.016548 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.016654 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:20.016631202 +0000 UTC m=+1022.553599126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.322686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.323146 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.323194 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:20.323179608 +0000 UTC m=+1022.860147532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.932756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.933073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.933700 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.933801 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:11:20.93378187 +0000 UTC m=+1023.470749794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.940752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.969347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" event={"ID":"873eb62b-74db-41cc-8249-3578cf2f59b4","Type":"ContainerStarted","Data":"45bb499e343d378f5fcd92c6fd1fbb01cabf301e2e5cb57a5eeb4e6ccfc3cb75"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.970086 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.985876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" event={"ID":"f6b9ed59-4089-4a80-bdae-368d169363f2","Type":"ContainerStarted","Data":"eba3fbc9a766d8471b05123dade75c106f675b0813744d18be010838869a4b95"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.986390 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:11:04 crc 
kubenswrapper[4722]: I0226 20:11:04.992161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" event={"ID":"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2","Type":"ContainerStarted","Data":"6e761f2eaaa1e94e5f040f441dff652d149a082ccfddb15975f6231d8de4a565"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.992931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.013257 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" event={"ID":"109ec0d2-04bf-4476-b14c-51249361da38","Type":"ContainerStarted","Data":"0a8ec44dfd9f689711af7fb16e4ceec9343b80d51e10b892b7841ab2a06aa6e9"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.013395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.023004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" event={"ID":"5d4b2367-21d7-4be2-a83b-1932bd988df5","Type":"ContainerStarted","Data":"6d1d175a70d6bbb9740244a655e4dee60e6d8828c1b4b8df0abcae4d8e4bff67"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.023829 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.041324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" 
event={"ID":"532a7206-b336-4471-b9ad-c009c9395015","Type":"ContainerStarted","Data":"b2e8cf029c91006ae9ef91631be3b030fb2c7250197566bc89feabc9be4f6e8b"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.041993 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.056613 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" podStartSLOduration=3.833879372 podStartE2EDuration="18.05659734s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.186466858 +0000 UTC m=+991.723434782" lastFinishedPulling="2026-02-26 20:11:03.409184826 +0000 UTC m=+1005.946152750" observedRunningTime="2026-02-26 20:11:05.052116949 +0000 UTC m=+1007.589084863" watchObservedRunningTime="2026-02-26 20:11:05.05659734 +0000 UTC m=+1007.593565264" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.057315 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" podStartSLOduration=3.460156683 podStartE2EDuration="17.057311361s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.810900138 +0000 UTC m=+992.347868072" lastFinishedPulling="2026-02-26 20:11:03.408054826 +0000 UTC m=+1005.945022750" observedRunningTime="2026-02-26 20:11:05.00085983 +0000 UTC m=+1007.537827754" watchObservedRunningTime="2026-02-26 20:11:05.057311361 +0000 UTC m=+1007.594279285" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.058075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" 
event={"ID":"604550ce-766e-48bb-a0a7-d14b7708a44e","Type":"ContainerStarted","Data":"e7b7d7ab23796e384dd48192aac71f7712fbb69bd06ed3a5c1cb8edadee6dc76"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.058867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.088603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" event={"ID":"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db","Type":"ContainerStarted","Data":"8d3dbedf96a2151a528620f8ee3570904f886d249fb0f77675ca4988e9e6d71d"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.089547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.108293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" event={"ID":"a21a637b-e5c6-47ab-a41e-9622452be17e","Type":"ContainerStarted","Data":"d1ca2032f1ed92ddca46d8e00ac7af0c607bfb7e5f9c955f9c39a8c935a44014"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.108869 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.117671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" event={"ID":"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd","Type":"ContainerStarted","Data":"490c048b00bdbb90d02254288cfe7e680e39defcef000d418aacb49e7ae0eedd"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.117824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.127364 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" podStartSLOduration=3.232188083 podStartE2EDuration="17.127342212s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.545901417 +0000 UTC m=+992.082869341" lastFinishedPulling="2026-02-26 20:11:03.441055546 +0000 UTC m=+1005.978023470" observedRunningTime="2026-02-26 20:11:05.1038316 +0000 UTC m=+1007.640799524" watchObservedRunningTime="2026-02-26 20:11:05.127342212 +0000 UTC m=+1007.664310136" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.132368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.133357 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" podStartSLOduration=3.023710293 podStartE2EDuration="17.133339275s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.342456965 +0000 UTC m=+991.879424889" lastFinishedPulling="2026-02-26 20:11:03.452085947 +0000 UTC m=+1005.989053871" observedRunningTime="2026-02-26 20:11:05.126370865 +0000 UTC m=+1007.663338789" watchObservedRunningTime="2026-02-26 20:11:05.133339275 +0000 UTC m=+1007.670307199" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.144386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" 
event={"ID":"c59c3e1b-9d18-45eb-a409-bd2176527063","Type":"ContainerStarted","Data":"69800daaac16bd9abec84f5a4e3c3f545f293a33e78d4b95f7705e32c98584e2"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.145075 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.146734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" event={"ID":"c3c3e040-3df2-4b02-9d09-a76bcc90b882","Type":"ContainerStarted","Data":"069cd3bbb87cafd128eb47a0ad3f178b609283c8d8d9a6e6a852bb2152c2ad0c"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.146913 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.150842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" event={"ID":"371eef1d-3e55-48bb-8b14-f2c36fbc5689","Type":"ContainerStarted","Data":"ec147c0956141daedddb0d0a95a3a6c42a096e1ca95abb83dc461a14221853ca"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.151270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.170287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" event={"ID":"2efbc411-9d10-4261-952f-5b97cbdc9e48","Type":"ContainerStarted","Data":"a619d0b6b00b5ba83c887e490097c1068d2413500695e174d707ff17d336428f"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.170855 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.176892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" podStartSLOduration=4.140203582 podStartE2EDuration="18.176877393s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.371298412 +0000 UTC m=+991.908266336" lastFinishedPulling="2026-02-26 20:11:03.407972223 +0000 UTC m=+1005.944940147" observedRunningTime="2026-02-26 20:11:05.15147798 +0000 UTC m=+1007.688445904" watchObservedRunningTime="2026-02-26 20:11:05.176877393 +0000 UTC m=+1007.713845317" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.179159 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" podStartSLOduration=3.660955783 podStartE2EDuration="17.179151755s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.940782923 +0000 UTC m=+992.477750847" lastFinishedPulling="2026-02-26 20:11:03.458978895 +0000 UTC m=+1005.995946819" observedRunningTime="2026-02-26 20:11:05.176853702 +0000 UTC m=+1007.713821626" watchObservedRunningTime="2026-02-26 20:11:05.179151755 +0000 UTC m=+1007.716119689" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.202769 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" podStartSLOduration=4.131303637 podStartE2EDuration="17.20275353s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.72192618 +0000 UTC m=+992.258894104" lastFinishedPulling="2026-02-26 20:11:02.793376073 +0000 UTC m=+1005.330343997" observedRunningTime="2026-02-26 20:11:05.197841195 +0000 UTC m=+1007.734809129" 
watchObservedRunningTime="2026-02-26 20:11:05.20275353 +0000 UTC m=+1007.739721454" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.235785 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" podStartSLOduration=4.059351985 podStartE2EDuration="18.2357686s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.283666171 +0000 UTC m=+991.820634095" lastFinishedPulling="2026-02-26 20:11:03.460082786 +0000 UTC m=+1005.997050710" observedRunningTime="2026-02-26 20:11:05.230755034 +0000 UTC m=+1007.767722958" watchObservedRunningTime="2026-02-26 20:11:05.2357686 +0000 UTC m=+1007.772736524" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.284305 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" podStartSLOduration=3.842873617 podStartE2EDuration="17.284273734s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.998811696 +0000 UTC m=+992.535779620" lastFinishedPulling="2026-02-26 20:11:03.440211813 +0000 UTC m=+1005.977179737" observedRunningTime="2026-02-26 20:11:05.249429593 +0000 UTC m=+1007.786397527" watchObservedRunningTime="2026-02-26 20:11:05.284273734 +0000 UTC m=+1007.821241668" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.293268 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" podStartSLOduration=3.996245832 podStartE2EDuration="17.293244209s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.163312185 +0000 UTC m=+992.700280109" lastFinishedPulling="2026-02-26 20:11:03.460310562 +0000 UTC m=+1005.997278486" observedRunningTime="2026-02-26 20:11:05.288516519 +0000 UTC m=+1007.825484453" 
watchObservedRunningTime="2026-02-26 20:11:05.293244209 +0000 UTC m=+1007.830212143" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.356410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" podStartSLOduration=4.513181089 podStartE2EDuration="18.356386111s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:48.950653804 +0000 UTC m=+991.487621718" lastFinishedPulling="2026-02-26 20:11:02.793858816 +0000 UTC m=+1005.330826740" observedRunningTime="2026-02-26 20:11:05.313655515 +0000 UTC m=+1007.850623439" watchObservedRunningTime="2026-02-26 20:11:05.356386111 +0000 UTC m=+1007.893354035" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.367176 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" podStartSLOduration=4.601137628 podStartE2EDuration="17.367158215s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.998855697 +0000 UTC m=+992.535823621" lastFinishedPulling="2026-02-26 20:11:02.764876284 +0000 UTC m=+1005.301844208" observedRunningTime="2026-02-26 20:11:05.349880644 +0000 UTC m=+1007.886848578" watchObservedRunningTime="2026-02-26 20:11:05.367158215 +0000 UTC m=+1007.904126149" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.406660 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" podStartSLOduration=3.33833075 podStartE2EDuration="17.406643803s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.371995012 +0000 UTC m=+991.908962936" lastFinishedPulling="2026-02-26 20:11:03.440308065 +0000 UTC m=+1005.977275989" observedRunningTime="2026-02-26 20:11:05.406114628 +0000 UTC m=+1007.943082552" 
watchObservedRunningTime="2026-02-26 20:11:05.406643803 +0000 UTC m=+1007.943611727" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.428889 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" podStartSLOduration=4.6858118 podStartE2EDuration="18.428873769s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.020446968 +0000 UTC m=+991.557414892" lastFinishedPulling="2026-02-26 20:11:02.763508937 +0000 UTC m=+1005.300476861" observedRunningTime="2026-02-26 20:11:05.427964955 +0000 UTC m=+1007.964932879" watchObservedRunningTime="2026-02-26 20:11:05.428873769 +0000 UTC m=+1007.965841693" Feb 26 20:11:08 crc kubenswrapper[4722]: I0226 20:11:08.655884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:11:09 crc kubenswrapper[4722]: I0226 20:11:09.182716 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:11:09 crc kubenswrapper[4722]: I0226 20:11:09.284722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.222830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" event={"ID":"4b98eee6-c514-4ca3-8544-a6978b6ed230","Type":"ContainerStarted","Data":"cc26c747c162c87bc512c50a1e7ffdfb3a4b5e38f9f232b5fc5bb9ed43423c0f"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.224509 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 
20:11:11.224649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" event={"ID":"710dce51-9c0f-4b66-9f5e-39cfe744f275","Type":"ContainerStarted","Data":"850c336695c41498486c644424e031c440416b589d302465a5a41da050fc2a1e"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.224865 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.226501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" event={"ID":"50694186-e31c-499d-ba48-e5818eeceee5","Type":"ContainerStarted","Data":"65c5c46b6c3deedf56f2d40ffc27338b27e04502a5cebfbfd7a29c3abc658f05"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.227917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" event={"ID":"2bcd6197-b9a9-4330-a25f-aab80685aa27","Type":"ContainerStarted","Data":"a95a91a79447f35400a8a3a638299798be47aaca8228fd3a0185b3b60e0cc270"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.228103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.243627 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podStartSLOduration=2.646779607 podStartE2EDuration="23.243609461s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.005600491 +0000 UTC m=+992.542568415" lastFinishedPulling="2026-02-26 20:11:10.602430345 +0000 UTC m=+1013.139398269" observedRunningTime="2026-02-26 20:11:11.241109213 +0000 UTC m=+1013.778077157" 
watchObservedRunningTime="2026-02-26 20:11:11.243609461 +0000 UTC m=+1013.780577385" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.273642 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podStartSLOduration=2.641214126 podStartE2EDuration="23.273624491s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.005890439 +0000 UTC m=+992.542858363" lastFinishedPulling="2026-02-26 20:11:10.638300804 +0000 UTC m=+1013.175268728" observedRunningTime="2026-02-26 20:11:11.270086873 +0000 UTC m=+1013.807054797" watchObservedRunningTime="2026-02-26 20:11:11.273624491 +0000 UTC m=+1013.810592415" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.275018 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podStartSLOduration=2.697344566 podStartE2EDuration="23.275011188s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.999062752 +0000 UTC m=+992.536030676" lastFinishedPulling="2026-02-26 20:11:10.576729374 +0000 UTC m=+1013.113697298" observedRunningTime="2026-02-26 20:11:11.258721853 +0000 UTC m=+1013.795689787" watchObservedRunningTime="2026-02-26 20:11:11.275011188 +0000 UTC m=+1013.811979112" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.284288 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podStartSLOduration=2.938539589 podStartE2EDuration="23.284271561s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.281631633 +0000 UTC m=+992.818599557" lastFinishedPulling="2026-02-26 20:11:10.627363605 +0000 UTC m=+1013.164331529" observedRunningTime="2026-02-26 20:11:11.283330545 +0000 UTC m=+1013.820298479" 
watchObservedRunningTime="2026-02-26 20:11:11.284271561 +0000 UTC m=+1013.821239485" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.268469 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.306848 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.308558 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.371331 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.415298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.502078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.610442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.616609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.680378 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.723181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.770428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.956813 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.976743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:11:19 crc kubenswrapper[4722]: I0226 20:11:19.110094 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.111177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.118540 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 
20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.321831 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tjl4b" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.330596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.418695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.427623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.553005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bh4dd" Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.560918 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:20 crc kubenswrapper[4722]: W0226 20:11:20.608320 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a379e8a_c5df_465e_8b23_6b9ee6c874f9.slice/crio-cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8 WatchSource:0}: Error finding container cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8: Status 404 returned error can't find the container with id cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8 Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.611583 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"] Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.991344 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"] Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.028102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.037654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 
20:11:21.189253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6dk85" Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.198480 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.349506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" event={"ID":"2a379e8a-c5df-465e-8b23-6b9ee6c874f9","Type":"ContainerStarted","Data":"cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8"} Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.350846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" event={"ID":"7e0beaae-8f5c-4504-9d2a-1b32980e4f37","Type":"ContainerStarted","Data":"a447f8d3411eddfa0689753483afedf30e250b7b0852eb65564f49646faffe37"} Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.395461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"] Feb 26 20:11:21 crc kubenswrapper[4722]: W0226 20:11:21.397852 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d97484_b285_458e_94f4_3bd8700a25d7.slice/crio-493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a WatchSource:0}: Error finding container 493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a: Status 404 returned error can't find the container with id 493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a Feb 26 20:11:22 crc kubenswrapper[4722]: I0226 20:11:22.387717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" event={"ID":"c7d97484-b285-458e-94f4-3bd8700a25d7","Type":"ContainerStarted","Data":"493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a"} Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.397746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" event={"ID":"c7d97484-b285-458e-94f4-3bd8700a25d7","Type":"ContainerStarted","Data":"4854a9f423d470ebb436c39288c35f9194752848ed65b46a15f897477f45f2d0"} Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.397899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.427990 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" podStartSLOduration=35.427970526 podStartE2EDuration="35.427970526s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:11:23.425705485 +0000 UTC m=+1025.962673429" watchObservedRunningTime="2026-02-26 20:11:23.427970526 +0000 UTC m=+1025.964938450" Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.426814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" event={"ID":"b96ea9ca-8ca1-41aa-af25-a184c79bf18f","Type":"ContainerStarted","Data":"d22c7a17a609dd874b8c84ae4a21da69a00f16817ea2e227ca17e1cf4b6690b4"} Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.427719 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:11:26 crc 
kubenswrapper[4722]: I0226 20:11:26.429858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" event={"ID":"6bc05a1e-4ace-47bc-af66-42c44dc19b80","Type":"ContainerStarted","Data":"a58f0bbbcd885404fc468c62bf71c0732d071e62e3f922a87de9edfc310a51e5"} Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.430078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.458239 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podStartSLOduration=2.91768974 podStartE2EDuration="38.458223666s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.690015429 +0000 UTC m=+992.226983353" lastFinishedPulling="2026-02-26 20:11:25.230549335 +0000 UTC m=+1027.767517279" observedRunningTime="2026-02-26 20:11:26.440285686 +0000 UTC m=+1028.977253610" watchObservedRunningTime="2026-02-26 20:11:26.458223666 +0000 UTC m=+1028.995191590" Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.438167 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" event={"ID":"7e0beaae-8f5c-4504-9d2a-1b32980e4f37","Type":"ContainerStarted","Data":"ce6593d01208b6488ce928c9b4c03490912bd019a25f8daef88ac40e1a0270ae"} Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.439215 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.441719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" 
event={"ID":"2a379e8a-c5df-465e-8b23-6b9ee6c874f9","Type":"ContainerStarted","Data":"ccf2ff52308199c3f33f47b16c37b96f7b61d7153bc65e24b96739a8e026e008"} Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.441755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.469218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podStartSLOduration=4.042476524 podStartE2EDuration="39.469200553s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.806578181 +0000 UTC m=+992.343546105" lastFinishedPulling="2026-02-26 20:11:25.2333022 +0000 UTC m=+1027.770270134" observedRunningTime="2026-02-26 20:11:26.468324892 +0000 UTC m=+1029.005292846" watchObservedRunningTime="2026-02-26 20:11:27.469200553 +0000 UTC m=+1030.006168477" Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.470728 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" podStartSLOduration=33.486056156 podStartE2EDuration="39.470716405s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:11:21.00211572 +0000 UTC m=+1023.539083644" lastFinishedPulling="2026-02-26 20:11:26.986775969 +0000 UTC m=+1029.523743893" observedRunningTime="2026-02-26 20:11:27.463106957 +0000 UTC m=+1030.000074901" watchObservedRunningTime="2026-02-26 20:11:27.470716405 +0000 UTC m=+1030.007684329" Feb 26 20:11:31 crc kubenswrapper[4722]: I0226 20:11:31.206190 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:31 crc kubenswrapper[4722]: I0226 20:11:31.234708 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" podStartSLOduration=36.865981817 podStartE2EDuration="43.234683015s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:11:20.61439506 +0000 UTC m=+1023.151362994" lastFinishedPulling="2026-02-26 20:11:26.983096268 +0000 UTC m=+1029.520064192" observedRunningTime="2026-02-26 20:11:27.490624808 +0000 UTC m=+1030.027592752" watchObservedRunningTime="2026-02-26 20:11:31.234683015 +0000 UTC m=+1033.771650969" Feb 26 20:11:38 crc kubenswrapper[4722]: I0226 20:11:38.635819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:11:38 crc kubenswrapper[4722]: I0226 20:11:38.684868 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:11:40 crc kubenswrapper[4722]: I0226 20:11:40.338889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:40 crc kubenswrapper[4722]: I0226 20:11:40.569097 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.746453 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.749378 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.758814 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.758913 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.759209 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8c4mk" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.759326 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.760358 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.792019 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.793432 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.795875 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.802280 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.968928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.968988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 
20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.989114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.990101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.119684 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.130423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.386426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.461861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:11:57 crc kubenswrapper[4722]: W0226 20:11:57.464291 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51cda6ae_4351_4bcb_b533_54a4103a10a0.slice/crio-ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba WatchSource:0}: Error finding container ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba: Status 404 returned error can't find the container with id ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.685978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" event={"ID":"2995f0a7-c3bd-4a2f-8c24-2982b38076bd","Type":"ContainerStarted","Data":"440c8d47642ac8b0dfb7f85ed0c8feab125f64e8fa816b2aba0668d34dce72b9"} Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.687475 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" 
event={"ID":"51cda6ae-4351-4bcb-b533-54a4103a10a0","Type":"ContainerStarted","Data":"ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba"} Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.266556 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.306910 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"] Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.309190 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.318998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.319063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.319338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.322767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"] Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.424264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.430458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.446869 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.600113 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.623028 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.624193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632405 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.635593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.641098 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733334 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.734830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.734929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.756479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.946409 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.142009 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.142909 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147522 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.175383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.211578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:12:00 crc kubenswrapper[4722]: W0226 20:12:00.212756 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08daf4e8_990e_4891_a06c_53fe8ba611db.slice/crio-7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276 WatchSource:0}: Error finding container 7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276: Status 404 returned error can't find the container with id 7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.245353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.346043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.364311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.442345 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.471918 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.474329 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: W0226 20:12:00.480116 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8e8bf9_7dbe_4c58_80bf_f0c273fd4df8.slice/crio-b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d WatchSource:0}: Error finding container b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d: Status 404 returned error can't find the container with id b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.492928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.495211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.495990 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497112 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497581 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.498618 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dspkw"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.504821 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.719563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerStarted","Data":"b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d"}
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.722506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" event={"ID":"08daf4e8-990e-4891-a06c-53fe8ba611db","Type":"ContainerStarted","Data":"7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276"}
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.747708 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.749465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758330 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758577 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758749 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mrr5c"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758799 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.774475 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.790198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.790410 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.792218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.792638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.793180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.793754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.796200 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.796236 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd14e19774d71f109a19171e3fc1d26ffc39fb374e187e66a1dc69515e8b6e48/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.800222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.802596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.807690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.808820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.832200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.854392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891579 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.961789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.997754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.998705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.999814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.000197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.000644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.008758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.022036 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.022102 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcc038ee7f96188050e1013bbe01ce8f5883fc8f59481375757326e8cc4a362e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.025892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.026970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.027616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.028098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.093414 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310eccc9_804e_4a2c_ba45_adf425f191ba.slice/crio-932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b WatchSource:0}: Error finding container 932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b: Status 404 returned error can't find the container with id 932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.144931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.166626 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.559733 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.562466 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda913d767_5243_448d_b5e9_6112a27b6233.slice/crio-5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3 WatchSource:0}: Error finding container 5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3: Status 404 returned error can't find the container with id 5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3 Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.699240 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b02241f_513e_4558_b519_5bd84e5b4eff.slice/crio-6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b WatchSource:0}: Error finding container 6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b: Status 404 returned error can't find the container with id 6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.704603 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.740090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerStarted","Data":"932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b"} Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.741672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b"} Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.743756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3"} Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.183671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.185016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.185103 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.188119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.199056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m9ttw" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.201988 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.204016 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.220036 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.334974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqhb\" (UniqueName: 
\"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437092 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") 
pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqhb\" (UniqueName: \"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " 
pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438410 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.439844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.440180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.443483 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.443512 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5e9bb0f97983b6b578f58d217b44aa53456ebdba0137b94153a8f6fb23b752c/globalmount\"" pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.458803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.464735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.472405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqhb\" (UniqueName: \"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.504087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod 
\"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.527794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.347281 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.350254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.355797 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.355934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2wnm6" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.356327 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.356663 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.365607 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461463 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc 
kubenswrapper[4722]: I0226 20:12:03.461807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563967 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 
20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.565256 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.565418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566080 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566222 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6dd142a7bc9d6c8172e43170abafeabfe06bda4ee7515d6cd584e1e879a9e9ee/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.567636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.570999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.571390 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.584600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod 
\"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.615000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.683799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.700313 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.703344 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7g6xq" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708836 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.716072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.766827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.766960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.868927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 
20:12:03.868990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.870003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.870071 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: 
\"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.876849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.878038 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.887782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:04 crc kubenswrapper[4722]: I0226 20:12:04.044122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.740836 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.742550 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.745010 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8tslb" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.751097 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.811202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.912880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.946673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.080507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.507957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.510030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.518566 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.518946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519903 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-dzf55" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.563790 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627374 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627454 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627526 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627558 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.728886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.728957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729051 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.750657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.752873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.755974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: 
I0226 20:12:06.853640 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.043801 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.045982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049502 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049694 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050205 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z8rrv" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.051169 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 26 20:12:07 crc 
kubenswrapper[4722]: I0226 20:12:07.057846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.137650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.137967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138088 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138308 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod 
\"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.242772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.243522 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.244027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.245680 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.248946 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
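The entries above trace the kubelet's volume lifecycle for each pod: `operationExecutor.VerifyControllerAttachedVolume started`, then `operationExecutor.MountVolume started`, then `MountVolume.SetUp succeeded` (and, for CSI volumes such as the hostpath-provisioner PVC, a `MountDevice` step that is skipped when the driver does not advertise the STAGE_UNSTAGE_VOLUME capability). A minimal sketch of extracting these events per pod from raw journal lines — the regex and field names are assumptions fitted to the excerpt's klog formatting, not an official schema:

```python
import re
from collections import defaultdict

# Illustrative pattern for kubelet (klog) volume events as they appear in the
# journal excerpt above. Note the structured message is quoted, so inner
# quotes arrive as literal backslash-quote sequences (\").
EVENT_RE = re.compile(
    r'"(?P<event>operationExecutor\.VerifyControllerAttachedVolume started'
    r'|operationExecutor\.MountVolume started'
    r'|MountVolume\.SetUp succeeded)'
    r' for volume \\"(?P<volume>[^\\]+)\\"'
    r'.*pod="(?P<pod>[^"]+)"'
)

def volume_events(lines):
    """Group (event, volume) pairs by pod="ns/name" from raw journal lines."""
    events = defaultdict(list)
    for line in lines:
        m = EVENT_RE.search(line)
        if m:
            events[m.group("pod")].append((m.group("event"), m.group("volume")))
    return dict(events)

# Hypothetical single-line sample in the same shape as the log above
# (UniqueName and UID shortened for readability).
sample = (
    'Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.870071 4722 '
    'operation_generator.go:637] "MountVolume.SetUp succeeded for volume '
    '\\"config-data\\" (UniqueName: \\"kubernetes.io/configmap/0a4edaeb-config-data\\") '
    'pod \\"memcached-0\\" (UID: \\"0a4edaeb\\") " pod="openstack/memcached-0"'
)
print(volume_events([sample]))
```

Fed the full journal, this groups each pod's mount sequence so you can confirm that every `MountVolume started` is paired with a later `MountVolume.SetUp succeeded` before the sandbox is created.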
Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.248975 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2afc96fa7f9c378e63298d168f739061cadeeb81c2b7504ca3dad6d4afb5d2c4/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.271336 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.281880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.408413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.534703 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rsgbx"] Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.535839 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.537917 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.537997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fqbth" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.543323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.560164 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx"] Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584742 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 
20:12:09.584870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.585006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.615751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"] Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.617458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.641933 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"] Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688676 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") 
pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " 
pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689090 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689312 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.691093 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.691213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.694440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.694531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.718792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: 
\"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791230 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791603 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791632 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.793648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.806820 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.856346 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.930589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.010794 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.014605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017381 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6p8gv" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017551 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.018050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.020778 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.030840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 20:12:10 
crc kubenswrapper[4722]: I0226 20:12:10.097104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097298 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.198955 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199022 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.200651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.201169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.206303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.206585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc 
kubenswrapper[4722]: I0226 20:12:10.210731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.217799 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.218427 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1199dd216ba16fd5bc6d34afccac5ed7560d943453b769e8a2ea9686fc16e58f/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.220722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.266055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.345990 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.336671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.338735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.340769 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341303 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2d9nn" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341329 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.355301 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: 
\"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376321 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 
crc kubenswrapper[4722]: I0226 20:12:14.376385 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477651 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.479309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " 
pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.479969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488455 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488502 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/174397e58640cb911b34ca7f2c6a5a216c90b035a313626b2e5658dfaa3fbc88/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.491815 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.494262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.520491 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.658192 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.433587 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.434912 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437560 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-7xtxm" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437822 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437755 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.439853 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.455420 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.497551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.497612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.591894 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.593863 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.597622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.598056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.598223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599555 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.600527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.601555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.604357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.605740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.635789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.644067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704535 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.765565 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.790333 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.791678 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.802889 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.803114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.809781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.818494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.821735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.836705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.838475 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.863007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.881324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910661 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: 
\"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910740 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.945445 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.946505 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.956848 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.956923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957014 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957190 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.961890 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.981487 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.982278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.001172 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.002274 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.004329 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-58ztr" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.010203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012384 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012657 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.014983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.024514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.030624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod 
\"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.031926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.060235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.116731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.116826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117328 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: 
\"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 
20:12:16.117887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117965 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220438 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220718 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 
20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220770 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.221498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.222375 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.222698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.224124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: 
\"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.227182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.237872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.238484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.239270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.242904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.301838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.362042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.588103 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.590671 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.593262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.593470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.598727 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.725396 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.726917 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.729095 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.729172 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731072 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731216 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.743158 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840486 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840628 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840768 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.841212 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.842386 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.844722 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.848521 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.848764 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.849029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.849434 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.851002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.855499 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.855797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" 
(UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.860756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.861453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.876177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.900109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.902755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.941999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942378 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942558 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 
20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943065 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.944067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.944484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.951129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.954057 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.956195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.959323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.966799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.982797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044568 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044995 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.045841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.045985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048747 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.059902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 
20:12:17.064181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.256488 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.862711 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.862845 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6pbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-swdrv_openstack(51cda6ae-4351-4bcb-b533-54a4103a10a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.864097 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" podUID="51cda6ae-4351-4bcb-b533-54a4103a10a0" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.889007 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.889176 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prztd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-hmnmf_openstack(08daf4e8-990e-4891-a06c-53fe8ba611db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.890734 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.914341 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.914476 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vl45n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fd7cr_openstack(2995f0a7-c3bd-4a2f-8c24-2982b38076bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.915996 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" podUID="2995f0a7-c3bd-4a2f-8c24-2982b38076bd" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.982099 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.225745 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.229350 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.335548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.335990 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336162 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336248 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config" (OuterVolumeSpecName: "config") pod "2995f0a7-c3bd-4a2f-8c24-2982b38076bd" (UID: "2995f0a7-c3bd-4a2f-8c24-2982b38076bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.337680 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.338856 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.339528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config" (OuterVolumeSpecName: "config") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.342588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk" (OuterVolumeSpecName: "kube-api-access-d6pbk") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "kube-api-access-d6pbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.348946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n" (OuterVolumeSpecName: "kube-api-access-vl45n") pod "2995f0a7-c3bd-4a2f-8c24-2982b38076bd" (UID: "2995f0a7-c3bd-4a2f-8c24-2982b38076bd"). InnerVolumeSpecName "kube-api-access-vl45n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.438822 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.438938 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.439027 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.439102 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.452646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.877800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.895727 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: W0226 20:12:22.904670 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4edaeb_4029_4586_ab06_d09489d2e944.slice/crio-639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd WatchSource:0}: Error finding container 
639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd: Status 404 returned error can't find the container with id 639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.993580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerStarted","Data":"7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.995517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0a4edaeb-4029-4586-ab06-d09489d2e944","Type":"ContainerStarted","Data":"639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.997309 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8" exitCode=0 Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.997527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.998772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"74f47f240fb04ce704ba754614284d4fdebc067a959d08d7f59dd26603722edc"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.999948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" event={"ID":"51cda6ae-4351-4bcb-b533-54a4103a10a0","Type":"ContainerDied","Data":"ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba"} 
Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000085 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" event={"ID":"2995f0a7-c3bd-4a2f-8c24-2982b38076bd","Type":"ContainerDied","Data":"440c8d47642ac8b0dfb7f85ed0c8feab125f64e8fa816b2aba0668d34dce72b9"} Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000927 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.012985 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535612-72dkb" podStartSLOduration=1.868428113 podStartE2EDuration="23.012968762s" podCreationTimestamp="2026-02-26 20:12:00 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.100631112 +0000 UTC m=+1063.637599026" lastFinishedPulling="2026-02-26 20:12:22.245171751 +0000 UTC m=+1084.782139675" observedRunningTime="2026-02-26 20:12:23.004225074 +0000 UTC m=+1085.541193008" watchObservedRunningTime="2026-02-26 20:12:23.012968762 +0000 UTC m=+1085.549936686" Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.114428 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36cd9a41_f8ca_49e8_b8ad_00dcdd80aff7.slice/crio-ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd WatchSource:0}: Error finding container ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd: Status 404 returned error can't find the container with id ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.290912 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.299449 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.325572 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.334091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.545285 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.557980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.566237 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.634560 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.664199 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.671061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.707788 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734bb9a8_948b_4d5a_bdb1_df37ad791e6b.slice/crio-bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022 WatchSource:0}: Error finding container bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022: Status 404 returned error can't find the 
container with id bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.741538 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43abd91c_064b_4440_9bb9_8f9768720659.slice/crio-3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498 WatchSource:0}: Error finding container 3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498: Status 404 returned error can't find the container with id 3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.755650 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e2a737_a422_4ef4_9394_324953ef1ff2.slice/crio-91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04 WatchSource:0}: Error finding container 91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04: Status 404 returned error can't find the container with id 91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.767435 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12264086_b848_4375_9787_a2ff33b411f0.slice/crio-a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e WatchSource:0}: Error finding container a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e: Status 404 returned error can't find the container with id a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.768780 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.798050 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.814390 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.823927 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.830859 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"] Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.834589 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrz78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(a66cb8be-67f7-46f6-90c1-914129608068): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.836449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.838025 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmrrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-585d9bcbc-w5dgv_openstack(b1e5ce93-d4cd-4ef0-a71b-f63165e558cb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.838766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.839474 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.929094 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.933255 4722 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fdc8f7b_ae7f_41c5_b31b_c5eac16edebe.slice/crio-be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9 WatchSource:0}: Error finding container be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9: Status 404 returned error can't find the container with id be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9 Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.010001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" event={"ID":"734bb9a8-948b-4d5a-bdb1-df37ad791e6b","Type":"ContainerStarted","Data":"bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.013014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12","Type":"ContainerStarted","Data":"186b1b8bc7108e6d16fcc97b993508b8cdfb7c380b5f2673f1ec941686f73309"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.015155 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerStarted","Data":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.015337 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.016248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" event={"ID":"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb","Type":"ContainerStarted","Data":"c39150c171aaee41f935d23fac8a6b8ed15fdf8545a56979b03e0bc1c8741f45"} Feb 26 20:12:24 crc 
kubenswrapper[4722]: I0226 20:12:24.017454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"a66cb8be-67f7-46f6-90c1-914129608068","Type":"ContainerStarted","Data":"783854262f06d87db37ab931256d8570d4c48ad8794b84fa25582d426e151ccc"} Feb 26 20:12:24 crc kubenswrapper[4722]: E0226 20:12:24.017978 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.018971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" event={"ID":"43abd91c-064b-4440-9bb9-8f9768720659","Type":"ContainerStarted","Data":"3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498"} Feb 26 20:12:24 crc kubenswrapper[4722]: E0226 20:12:24.020562 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.020715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.022724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.030997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.033432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerStarted","Data":"4c5c905412b487d64b54a6c3d784b133430d8947b0b99214d7dbe7ea6a0f0b96"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.036864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038277 4722 generic.go:334] "Generic (PLEG): container finished" podID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerID="7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87" exitCode=0 Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerDied","Data":"7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038693 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" podStartSLOduration=3.160477733 podStartE2EDuration="25.038670821s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" 
firstStartedPulling="2026-02-26 20:12:00.506025997 +0000 UTC m=+1063.042993921" lastFinishedPulling="2026-02-26 20:12:22.384219085 +0000 UTC m=+1084.921187009" observedRunningTime="2026-02-26 20:12:24.030281532 +0000 UTC m=+1086.567249476" watchObservedRunningTime="2026-02-26 20:12:24.038670821 +0000 UTC m=+1086.575638755" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.039472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.041417 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.042422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" event={"ID":"1e16be72-77f7-43fb-a6bf-04088d7c6c0b","Type":"ContainerStarted","Data":"1d7f5d377002d7695e7310770965528ef31074561800c2ae4f4b0ad06f213141"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.048215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" event={"ID":"23fc144a-bb55-464d-8f21-94038bf68ecd","Type":"ContainerStarted","Data":"44c546f1368070c875e2cd9fb8de37579495cb6c83b8f6e79610acb5aaa55b84"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.049940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx" event={"ID":"5c9c23c8-6fed-49f5-abe1-d44b885952ec","Type":"ContainerStarted","Data":"239e1d85cb4be124bb0073c69a9eec8f22071f3716a2238a126918a27812d7c2"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.054227 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"082c8f6a-a03f-4567-891c-56b6aa6f26d3","Type":"ContainerStarted","Data":"9d93e21cedfbb0837d16f6dbbd73f5fd4c8e159f8624d06cfa4bccc3ea3841ba"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.161093 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2995f0a7-c3bd-4a2f-8c24-2982b38076bd" path="/var/lib/kubelet/pods/2995f0a7-c3bd-4a2f-8c24-2982b38076bd/volumes" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.161497 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cda6ae-4351-4bcb-b533-54a4103a10a0" path="/var/lib/kubelet/pods/51cda6ae-4351-4bcb-b533-54a4103a10a0/volumes" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.477949 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"] Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.904833 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 20:12:25 crc kubenswrapper[4722]: E0226 20:12:25.064626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:25 crc kubenswrapper[4722]: E0226 20:12:25.065775 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:25 crc 
kubenswrapper[4722]: W0226 20:12:25.362944 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4601fbad_d1bf_4205_86c5_a392e381300e.slice/crio-68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d WatchSource:0}: Error finding container 68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d: Status 404 returned error can't find the container with id 68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.431302 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb" Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.532425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"310eccc9-804e-4a2c-ba45-adf425f191ba\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.538011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2" (OuterVolumeSpecName: "kube-api-access-8wrn2") pod "310eccc9-804e-4a2c-ba45-adf425f191ba" (UID: "310eccc9-804e-4a2c-ba45-adf425f191ba"). InnerVolumeSpecName "kube-api-access-8wrn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.634815 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.067867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.074371 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079720 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerDied","Data":"932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b"} Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079775 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b" Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079753 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb" Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.081358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"1f1c7bd09a1ae8384615983ec1aaa06c41a8bf5ce1763a03ae9e848883492e27"} Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.087350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d"} Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.159584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" path="/var/lib/kubelet/pods/e3133c2f-ea60-41e1-bf7e-443c44a47c41/volumes" Feb 26 20:12:29 crc kubenswrapper[4722]: I0226 20:12:29.948416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" Feb 26 20:12:30 crc kubenswrapper[4722]: I0226 20:12:30.018319 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"] Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.070361 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104426 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.105090 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config" (OuterVolumeSpecName: "config") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.110422 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd" (OuterVolumeSpecName: "kube-api-access-prztd") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "kube-api-access-prztd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.183838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" event={"ID":"08daf4e8-990e-4891-a06c-53fe8ba611db","Type":"ContainerDied","Data":"7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276"} Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.184011 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206093 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206162 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206181 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.273288 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"] Feb 26 20:12:35 crc 
kubenswrapper[4722]: I0226 20:12:35.279861 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"] Feb 26 20:12:36 crc kubenswrapper[4722]: I0226 20:12:36.156732 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" path="/var/lib/kubelet/pods/08daf4e8-990e-4891-a06c-53fe8ba611db/volumes" Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967084 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967379 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967507 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rnwp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(e6617222-c81a-46cc-9c98-1170f7c89846): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.969679 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.202433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"0a4edaeb-4029-4586-ab06-d09489d2e944","Type":"ContainerStarted","Data":"d9456133d1f32883f9bd919e3a494f2759a7b4808f214fee976de594f40ada7b"} Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.202565 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.211543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" event={"ID":"734bb9a8-948b-4d5a-bdb1-df37ad791e6b","Type":"ContainerStarted","Data":"e8198eb82c4399ac14b3431bc806f8abdd1198a5901c58bf323a34adee0f8dbd"} Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.211662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.224865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a"} Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.226783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.178313221 podStartE2EDuration="34.226766807s" podCreationTimestamp="2026-02-26 20:12:03 +0000 UTC" firstStartedPulling="2026-02-26 20:12:22.907407092 +0000 UTC m=+1085.444375016" lastFinishedPulling="2026-02-26 20:12:34.955860648 +0000 UTC m=+1097.492828602" observedRunningTime="2026-02-26 20:12:37.219582811 +0000 UTC m=+1099.756550735" watchObservedRunningTime="2026-02-26 20:12:37.226766807 +0000 UTC m=+1099.763734751" Feb 26 20:12:37 crc kubenswrapper[4722]: E0226 20:12:37.226835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.246256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" podStartSLOduration=10.382406219 podStartE2EDuration="22.246236958s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.714389873 +0000 UTC m=+1086.251357797" lastFinishedPulling="2026-02-26 20:12:35.578220612 +0000 UTC m=+1098.115188536" observedRunningTime="2026-02-26 20:12:37.240551913 +0000 UTC m=+1099.777519857" watchObservedRunningTime="2026-02-26 20:12:37.246236958 +0000 UTC m=+1099.783204882" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.232909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12","Type":"ContainerStarted","Data":"1369b5b2de538727922d575d1e1a9b0b199def2d9c555b40c7cabc8513bfdebe"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.234292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.238156 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"082c8f6a-a03f-4567-891c-56b6aa6f26d3","Type":"ContainerStarted","Data":"fb8dfe786bc73e61b9839142bf2967dfa0c496559898b5ad895aeb950545bfda"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.238304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.240146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"3d9cabe7171b02963af5075866417446edb51805309744acc01a6d37e9b0b34c"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.242957 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"817cd7e495706646fc921cad0a3b34a3006a157de327d59b87d2d83b626a1c6d"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.246202 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" event={"ID":"23fc144a-bb55-464d-8f21-94038bf68ecd","Type":"ContainerStarted","Data":"a303150b4f9f36cbb300d114543122d1b4f80a55fe31ebf9662f07b5e41b7945"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.246394 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.254297 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=11.01490618 podStartE2EDuration="23.254282276s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.798448277 +0000 UTC m=+1086.335416201" lastFinishedPulling="2026-02-26 20:12:36.037824353 +0000 UTC m=+1098.574792297" observedRunningTime="2026-02-26 20:12:38.251806058 +0000 UTC m=+1100.788773982" watchObservedRunningTime="2026-02-26 20:12:38.254282276 +0000 UTC m=+1100.791250200" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.257428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5"} Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.269071 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.310550 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=11.099474366 podStartE2EDuration="23.31051698s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.826728698 +0000 UTC m=+1086.363696622" lastFinishedPulling="2026-02-26 20:12:36.037771312 +0000 UTC m=+1098.574739236" observedRunningTime="2026-02-26 20:12:38.309896663 +0000 UTC m=+1100.846864617" watchObservedRunningTime="2026-02-26 20:12:38.31051698 +0000 UTC m=+1100.847484914" Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.341605 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" podStartSLOduration=11.558906033 podStartE2EDuration="23.341579698s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.796573485 +0000 UTC m=+1086.333541409" lastFinishedPulling="2026-02-26 20:12:35.57924715 +0000 UTC m=+1098.116215074" observedRunningTime="2026-02-26 20:12:38.338181575 +0000 UTC m=+1100.875149509" watchObservedRunningTime="2026-02-26 20:12:38.341579698 +0000 UTC m=+1100.878547652" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.259195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx" event={"ID":"5c9c23c8-6fed-49f5-abe1-d44b885952ec","Type":"ContainerStarted","Data":"a1d8a50ae032048c486454579c044ae024d92e0600c45d28e8d8f77d371d6cb4"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.259602 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rsgbx" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.263960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" event={"ID":"43abd91c-064b-4440-9bb9-8f9768720659","Type":"ContainerStarted","Data":"bcc4d7992e2edcf10d20452b26d99c4c6199bba4f0da36a93d2530268b501f2a"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.264156 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.267433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" event={"ID":"1e16be72-77f7-43fb-a6bf-04088d7c6c0b","Type":"ContainerStarted","Data":"623ca138967fb764da843f3c1d43b086537c141c481a2439ab97c4a29ea3cd82"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.267518 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.269251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.272577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.286129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.288069 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" event={"ID":"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb","Type":"ContainerStarted","Data":"00d98aa3ca2cd92a65c8fd46ca0050881392fbbcd813e0c809c5d5ab9f2ab402"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.289675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.290982 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rsgbx" podStartSLOduration=18.180435897 podStartE2EDuration="30.290964197s" podCreationTimestamp="2026-02-26 20:12:09 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.743711703 +0000 UTC m=+1086.280679627" lastFinishedPulling="2026-02-26 20:12:35.854240003 +0000 UTC m=+1098.391207927" observedRunningTime="2026-02-26 20:12:39.274812849 +0000 UTC m=+1101.811780773" watchObservedRunningTime="2026-02-26 20:12:39.290964197 +0000 UTC m=+1101.827932111" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.293903 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.295414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"a66cb8be-67f7-46f6-90c1-914129608068","Type":"ContainerStarted","Data":"5b7b12a803aef39c920527eb5aa55cebe8c68af1acdaea97e31212c93ea6241d"} Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.296469 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.319872 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" podStartSLOduration=11.970202094 podStartE2EDuration="24.319807328s" 
podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.798391895 +0000 UTC m=+1086.335359819" lastFinishedPulling="2026-02-26 20:12:36.147997129 +0000 UTC m=+1098.684965053" observedRunningTime="2026-02-26 20:12:39.318302368 +0000 UTC m=+1101.855270322" watchObservedRunningTime="2026-02-26 20:12:39.319807328 +0000 UTC m=+1101.856775272" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.348291 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" podStartSLOduration=12.064830249 podStartE2EDuration="24.348269248s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.754275311 +0000 UTC m=+1086.291243235" lastFinishedPulling="2026-02-26 20:12:36.03771431 +0000 UTC m=+1098.574682234" observedRunningTime="2026-02-26 20:12:39.344053305 +0000 UTC m=+1101.881021249" watchObservedRunningTime="2026-02-26 20:12:39.348269248 +0000 UTC m=+1101.885237172" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.394229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372012.46057 podStartE2EDuration="24.394206793s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.834381577 +0000 UTC m=+1086.371349501" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:39.390071321 +0000 UTC m=+1101.927039265" watchObservedRunningTime="2026-02-26 20:12:39.394206793 +0000 UTC m=+1101.931174717" Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.409166 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podStartSLOduration=-9223372012.445642 podStartE2EDuration="24.409133637s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 
20:12:23.837886303 +0000 UTC m=+1086.374854227" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:39.407161124 +0000 UTC m=+1101.944129088" watchObservedRunningTime="2026-02-26 20:12:39.409133637 +0000 UTC m=+1101.946101561" Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.302802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"dc871fb12818708591ce63c9841a00dad813ba953abe385df9b2183850eb2c6c"} Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.305653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"800e4ebb73a14c6f1e1f5f89e478ff8aa5065bea26c540a288dc6d1a7515ba28"} Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.308154 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba0fada1-7131-401e-adf3-f9e05d1bd949" containerID="5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85" exitCode=0 Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.309011 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerDied","Data":"5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85"} Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.341276 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.683443231 podStartE2EDuration="32.341260114s" podCreationTimestamp="2026-02-26 20:12:08 +0000 UTC" firstStartedPulling="2026-02-26 20:12:25.370096974 +0000 UTC m=+1087.907064898" lastFinishedPulling="2026-02-26 20:12:40.027913867 +0000 UTC m=+1102.564881781" observedRunningTime="2026-02-26 20:12:40.330331678 +0000 UTC m=+1102.867299642" 
watchObservedRunningTime="2026-02-26 20:12:40.341260114 +0000 UTC m=+1102.878228038" Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.346415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.346554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.391017 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.289699631 podStartE2EDuration="27.391000261s" podCreationTimestamp="2026-02-26 20:12:13 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.935423885 +0000 UTC m=+1086.472391809" lastFinishedPulling="2026-02-26 20:12:40.036724495 +0000 UTC m=+1102.573692439" observedRunningTime="2026-02-26 20:12:40.378276466 +0000 UTC m=+1102.915244400" watchObservedRunningTime="2026-02-26 20:12:40.391000261 +0000 UTC m=+1102.927968185" Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.402379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.318611 4722 generic.go:334] "Generic (PLEG): container finished" podID="ffecd786-4ba4-4d40-9b0a-aa0af47577ad" containerID="7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a" exitCode=0 Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.318739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerDied","Data":"7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a"} Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" 
event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"8b36f1a0086f9667a6f1d4421892dbf02ea861821a6f01877f25d09147844a46"} Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"66d446b0dd5eeae72079f0ece8bad20014fae6e11a8eb8b38c3f8b4de38c91bd"} Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.372604 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k7h8c" podStartSLOduration=21.801833093 podStartE2EDuration="32.372436683s" podCreationTimestamp="2026-02-26 20:12:09 +0000 UTC" firstStartedPulling="2026-02-26 20:12:25.36889256 +0000 UTC m=+1087.905860484" lastFinishedPulling="2026-02-26 20:12:35.93949611 +0000 UTC m=+1098.476464074" observedRunningTime="2026-02-26 20:12:41.366106741 +0000 UTC m=+1103.903074685" watchObservedRunningTime="2026-02-26 20:12:41.372436683 +0000 UTC m=+1103.909404637" Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.658778 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.720966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.333001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7h8c" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.333036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 
20:12:42.379079 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.385691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.577342 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:42 crc kubenswrapper[4722]: E0226 20:12:42.579826 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.579858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.580108 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.584369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.587881 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.591242 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.635100 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.636236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.640094 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.645798 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " 
pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679211 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" 
(UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.721245 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:42 crc kubenswrapper[4722]: E0226 20:12:42.722263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-fm4r9 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" podUID="bbebf8c1-c827-4450-9afc-4a89f4758d42" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.756799 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.758173 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.764379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.772196 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 
20:12:42.781925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782157 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.783105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.783884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.784129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.786283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.787950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.788021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.794614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.808399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.819667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.820784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: 
\"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.834401 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.835947 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.839875 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.840046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841252 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5rq6k" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.883755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.883808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: 
\"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: 
\"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884694 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: 
I0226 20:12:42.957261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " 
pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.987547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988739 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 
20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.992352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.994175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.994175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.002371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.006326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.079639 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.173242 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.344319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"27dfcb20c41e95a102b3bb1a1e40de41d839f23c27a6860c61db9b2e9dd97c33"} Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346077 4722 generic.go:334] "Generic (PLEG): container finished" podID="12264086-b848-4375-9787-a2ff33b411f0" containerID="0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5" exitCode=0 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346550 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerDied","Data":"0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5"} Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.365042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.370532 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.920580117 podStartE2EDuration="42.370514659s" podCreationTimestamp="2026-02-26 20:12:01 +0000 UTC" firstStartedPulling="2026-02-26 20:12:22.505908866 +0000 UTC m=+1085.042876790" lastFinishedPulling="2026-02-26 20:12:34.955843398 +0000 UTC m=+1097.492811332" observedRunningTime="2026-02-26 20:12:43.36092457 +0000 UTC m=+1105.897892504" watchObservedRunningTime="2026-02-26 20:12:43.370514659 +0000 UTC m=+1105.907482583" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.415880 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"] Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.430428 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod721ad050_b6a8_432b_89b0_226c0efa6222.slice/crio-d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91 WatchSource:0}: Error finding container d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91: Status 404 returned error can't find the container with id d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: 
\"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496728 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.499196 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.499576 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.501459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9" (OuterVolumeSpecName: "kube-api-access-fm4r9") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "kube-api-access-fm4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.502467 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config" (OuterVolumeSpecName: "config") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.575997 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.582984 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcff378_b980_4f5a_b7dd_e2b84158425d.slice/crio-a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490 WatchSource:0}: Error finding container a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490: Status 404 returned error can't find the container with id a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599694 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599725 4722 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599736 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599747 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.776517 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc64118dc_ed6e_478a_9c59_d7e24212daba.slice/crio-f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135 WatchSource:0}: Error finding container f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135: Status 404 returned error can't find the container with id f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.777109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.045885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.355952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"00ed4ed2eeb4ce9785a9647c5714ff2916aef556f5db279f1767c15db23e2e7c"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359300 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" exitCode=0 Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerStarted","Data":"a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.360739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfkn8" event={"ID":"721ad050-b6a8-432b-89b0-226c0efa6222","Type":"ContainerStarted","Data":"d10b6dc46d77dd8fc279048ce9920754fc45ddd6dbc1ecf537e7086fe59cf5eb"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.360795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfkn8" event={"ID":"721ad050-b6a8-432b-89b0-226c0efa6222","Type":"ContainerStarted","Data":"d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.362658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.362710 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.378832 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.102841464 podStartE2EDuration="42.378807279s" podCreationTimestamp="2026-02-26 20:12:02 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.774698638 +0000 UTC m=+1086.311666562" lastFinishedPulling="2026-02-26 20:12:36.050664453 +0000 UTC m=+1098.587632377" observedRunningTime="2026-02-26 20:12:44.37809427 +0000 UTC m=+1106.915062204" watchObservedRunningTime="2026-02-26 20:12:44.378807279 +0000 UTC m=+1106.915775203" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.409351 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nfkn8" podStartSLOduration=2.409330385 podStartE2EDuration="2.409330385s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:44.394325379 +0000 UTC m=+1106.931293333" watchObservedRunningTime="2026-02-26 20:12:44.409330385 +0000 UTC m=+1106.946298309" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.497174 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.503833 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.374426 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b" exitCode=0 Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.374527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.383423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerStarted","Data":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.383572 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"277f9e5291c1e61826af01c8c0ee82a4f680c36af5d12e47af9057fb8efdfa6f"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"fc22b880b55d09e4eaaf213cc5c859b3c2ada611e186b7a7184d61733e4760df"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387573 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.390178 4722 generic.go:334] "Generic (PLEG): container finished" podID="36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7" containerID="f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a" exitCode=0 Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.390933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerDied","Data":"f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.448429 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-z5nvk" podStartSLOduration=3.448410879 podStartE2EDuration="3.448410879s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:45.441673606 +0000 UTC m=+1107.978641530" watchObservedRunningTime="2026-02-26 20:12:45.448410879 +0000 UTC m=+1107.985378803" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.471831 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.512204942 podStartE2EDuration="3.471811762s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="2026-02-26 20:12:43.778696495 +0000 UTC m=+1106.315664419" lastFinishedPulling="2026-02-26 20:12:44.738303325 +0000 UTC m=+1107.275271239" observedRunningTime="2026-02-26 20:12:45.461316438 +0000 UTC m=+1107.998284372" watchObservedRunningTime="2026-02-26 20:12:45.471811762 +0000 UTC m=+1108.008779686" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.024629 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.063827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.069458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.076297 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146586 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.157416 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbebf8c1-c827-4450-9afc-4a89f4758d42" path="/var/lib/kubelet/pods/bbebf8c1-c827-4450-9afc-4a89f4758d42/volumes" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248361 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") 
" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.249290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.249785 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.250616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.251150 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.268992 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.385700 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.847994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: W0226 20:12:46.851176 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8aa05bc_6ef2_48f1_83c4_2009a9b33e40.slice/crio-cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f WatchSource:0}: Error finding container cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f: Status 404 returned error can't find the container with id cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.170941 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.179216 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.181933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183029 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183038 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gh256" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.194954 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" 
(UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.367931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc 
kubenswrapper[4722]: I0226 20:12:47.368066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368597 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368630 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368696 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift 
podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:47.868679018 +0000 UTC m=+1110.405646932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.369287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.369377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.372572 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.372623 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65a9cd87adf4cce73990a0e2381601df4f2b796197e9f55bedb53dfac08c1ac2/globalmount\"" pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.373247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.398966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.399769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.416995 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerID="f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c" exitCode=0 Feb 26 20:12:47 crc kubenswrapper[4722]: 
I0226 20:12:47.417056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c"} Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.417109 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerStarted","Data":"cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f"} Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.417195 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-z5nvk" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" containerID="cri-o://4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" gracePeriod=10 Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.795002 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.796167 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.808612 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.808936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.809012 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.836198 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880157 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880231 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880492 4722 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880505 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880541 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:48.880528161 +0000 UTC m=+1111.417496085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.923018 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982169 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.989538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.989871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.994294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod 
\"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.994500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.000490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.005833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.009078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.012668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 
crc kubenswrapper[4722]: I0226 20:12:48.013163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29" (OuterVolumeSpecName: "kube-api-access-vpb29") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "kube-api-access-vpb29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.015547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.016121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.018115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.038131 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.039418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.044421 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.056775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config" (OuterVolumeSpecName: "config") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109217 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109256 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109267 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109275 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109285 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.174359 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427514 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" exitCode=0 Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427633 4722 scope.go:117] "RemoveContainer" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427754 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.435074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerStarted","Data":"4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.435294 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.465645 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podStartSLOduration=2.465623588 podStartE2EDuration="2.465623588s" podCreationTimestamp="2026-02-26 20:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:48.457825757 +0000 UTC m=+1110.994793681" watchObservedRunningTime="2026-02-26 20:12:48.465623588 +0000 UTC m=+1111.002591512" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.474215 4722 scope.go:117] "RemoveContainer" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.494726 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.513351 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.514220 4722 scope.go:117] "RemoveContainer" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.514946 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": container with ID starting with 4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7 not found: ID does not exist" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.514981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} err="failed to get container status \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": rpc error: code = NotFound desc = could not find container \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": container with ID starting with 4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7 not found: ID does not exist" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.515002 4722 scope.go:117] "RemoveContainer" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.515351 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": container with ID starting with 52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec not found: ID does not exist" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.515389 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec"} err="failed to get container status \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": rpc error: code = NotFound desc = could not find container \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": container 
with ID starting with 52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec not found: ID does not exist" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.657818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.924089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924276 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924290 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924327 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:50.924312552 +0000 UTC m=+1113.461280476 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:49 crc kubenswrapper[4722]: I0226 20:12:49.453221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerStarted","Data":"70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce"} Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.156170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" path="/var/lib/kubelet/pods/3bcff378-b980-4f5a-b7dd-e2b84158425d/volumes" Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.462552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"27fb0a59a0ec5b03537213b2d5da3ff610d4773c7de5a63c6bcdba6e8eabf611"} Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.969677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969902 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969924 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969987 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:54.969968087 +0000 UTC m=+1117.506936011 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.528281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.528915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.610801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.489170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"424b228d6e6fdef71ae464e6cfa44b75c6d3e46a6cbb3b7e16cadf7276478d50"} Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.489861 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.493181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.519481 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=20.904326088 podStartE2EDuration="47.519462649s" podCreationTimestamp="2026-02-26 20:12:06 +0000 
UTC" firstStartedPulling="2026-02-26 20:12:23.116420415 +0000 UTC m=+1085.653388329" lastFinishedPulling="2026-02-26 20:12:49.731556966 +0000 UTC m=+1112.268524890" observedRunningTime="2026-02-26 20:12:53.512892782 +0000 UTC m=+1116.049860756" watchObservedRunningTime="2026-02-26 20:12:53.519462649 +0000 UTC m=+1116.056430573" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.578906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.684765 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.684824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.767741 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.250520 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:12:54 crc kubenswrapper[4722]: E0226 20:12:54.252175 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="init" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.252327 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="init" Feb 26 20:12:54 crc kubenswrapper[4722]: E0226 20:12:54.252401 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.252462 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc 
kubenswrapper[4722]: I0226 20:12:54.252749 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.253458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.256181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.271546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.281914 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.283078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.305573 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.352895 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.353102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: 
\"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.456363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.480236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.557944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.557997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.558700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.573978 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.575766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.582584 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.612368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.968969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fq8ft"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.970164 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.986967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fq8ft"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067744 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067773 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067836 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift 
podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:13:03.067817676 +0000 UTC m=+1125.604785600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.074291 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.075724 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.080456 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.092375 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.167257 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-42ds6"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168386 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.169019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.169606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod 
\"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.177540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-42ds6"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.195762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.272018 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.272650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.279948 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.281448 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.283511 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.288611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.288768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"] Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.291756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.376741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.398321 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.409547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.477807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.479402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.484431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.489904 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-42ds6" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.505486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.639648 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.779437 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.918256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:12:55 crc kubenswrapper[4722]: W0226 20:12:55.924226 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12bb8485_56aa_436e_abd8_5e63601f2ab8.slice/crio-3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119 WatchSource:0}: Error finding container 3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119: Status 404 returned error can't find the container with id 3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119 Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.006246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fq8ft"] Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.011913 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:56 crc 
kubenswrapper[4722]: I0226 20:12:56.014582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.253097 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.260436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"] Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.387687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.445851 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"] Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.446119 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns" containerID="cri-o://455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de" gracePeriod=10 Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.527955 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-42ds6"] Feb 26 20:12:56 crc kubenswrapper[4722]: W0226 20:12:56.532832 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb306548_9870_4ef0_ae38_af8d1edc3c3a.slice/crio-4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d WatchSource:0}: Error finding container 4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d: Status 404 returned error can't find the container with id 4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.533266 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerStarted","Data":"67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.540744 4722 generic.go:334] "Generic (PLEG): container finished" podID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerID="4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430" exitCode=0 Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.540803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.551961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"] Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.553919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerStarted","Data":"1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.554598 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.572472 4722 generic.go:334] "Generic (PLEG): container finished" podID="a913d767-5243-448d-b5e9-6112a27b6233" containerID="43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227" exitCode=0 Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.572592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.588976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.596310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerStarted","Data":"e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.599501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.790396771 podStartE2EDuration="51.59948389s" podCreationTimestamp="2026-02-26 20:12:05 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.638877062 +0000 UTC m=+1086.175844986" lastFinishedPulling="2026-02-26 20:12:55.447964181 +0000 UTC m=+1117.984932105" observedRunningTime="2026-02-26 20:12:56.5932011 +0000 UTC m=+1119.130169044" watchObservedRunningTime="2026-02-26 20:12:56.59948389 +0000 UTC m=+1119.136451824" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.610012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerStarted","Data":"73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.610405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" 
event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerStarted","Data":"3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.625520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerStarted","Data":"7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.640133 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-40c9-account-create-update-6b2zr" podStartSLOduration=2.640110821 podStartE2EDuration="2.640110821s" podCreationTimestamp="2026-02-26 20:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:56.634431447 +0000 UTC m=+1119.171399371" watchObservedRunningTime="2026-02-26 20:12:56.640110821 +0000 UTC m=+1119.177078755" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.657812 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerStarted","Data":"21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae"} Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.682438 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-lrpx8" podStartSLOduration=2.682416067 podStartE2EDuration="2.682416067s" podCreationTimestamp="2026-02-26 20:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:56.652069535 +0000 UTC m=+1119.189037469" watchObservedRunningTime="2026-02-26 20:12:56.682416067 +0000 UTC m=+1119.219383991" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.738318 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vfmbj" podStartSLOduration=2.976739746 podStartE2EDuration="9.73829647s" podCreationTimestamp="2026-02-26 20:12:47 +0000 UTC" firstStartedPulling="2026-02-26 20:12:48.663109977 +0000 UTC m=+1111.200077901" lastFinishedPulling="2026-02-26 20:12:55.424666701 +0000 UTC m=+1117.961634625" observedRunningTime="2026-02-26 20:12:56.679684163 +0000 UTC m=+1119.216652097" watchObservedRunningTime="2026-02-26 20:12:56.73829647 +0000 UTC m=+1119.275264414" Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.994801 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.051774 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.180778 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.269120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347237 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.365606 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz" (OuterVolumeSpecName: "kube-api-access-2tvtz") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "kube-api-access-2tvtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.396543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config" (OuterVolumeSpecName: "config") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.431780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450161 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450825 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450993 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.666177 4722 generic.go:334] "Generic (PLEG): container finished" podID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerID="e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67" exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 
20:12:57.666234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerDied","Data":"e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.669496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.670455 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673270 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de" exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673335 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673503 4722 scope.go:117] "RemoveContainer" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675048 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerID="f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2" exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerDied","Data":"f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675158 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerStarted","Data":"4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.681709 4722 generic.go:334] "Generic (PLEG): container finished" podID="66980b23-7973-4558-91ba-6f53c2ad7046" containerID="0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2" 
exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.681786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerDied","Data":"0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.683967 4722 generic.go:334] "Generic (PLEG): container finished" podID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerID="c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd" exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.684043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerDied","Data":"c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.684072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerStarted","Data":"dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.687108 4722 generic.go:334] "Generic (PLEG): container finished" podID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerID="73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7" exitCode=0 Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.687175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerDied","Data":"73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7"} Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.690533 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" 
containerID="03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.690622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerDied","Data":"03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.693790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.694329 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.718357 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.157216987 podStartE2EDuration="58.718338945s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.702712792 +0000 UTC m=+1064.239680716" lastFinishedPulling="2026-02-26 20:12:22.26383475 +0000 UTC m=+1084.800802674" observedRunningTime="2026-02-26 20:12:57.709740272 +0000 UTC m=+1120.246708216" watchObservedRunningTime="2026-02-26 20:12:57.718338945 +0000 UTC m=+1120.255306879"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.731569 4722 scope.go:117] "RemoveContainer" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.788701 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.795802 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.821731 4722 scope.go:117] "RemoveContainer" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"
Feb 26 20:12:57 crc kubenswrapper[4722]: E0226 20:12:57.827540 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": container with ID starting with 455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de not found: ID does not exist" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.827599 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"} err="failed to get container status \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": rpc error: code = NotFound desc = could not find container \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": container with ID starting with 455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de not found: ID does not exist"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.827627 4722 scope.go:117] "RemoveContainer" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: E0226 20:12:57.828058 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": container with ID starting with c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8 not found: ID does not exist" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.828117 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"} err="failed to get container status \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": rpc error: code = NotFound desc = could not find container \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": container with ID starting with c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8 not found: ID does not exist"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.849926 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.0234412 podStartE2EDuration="58.849888718s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.565255051 +0000 UTC m=+1064.102222975" lastFinishedPulling="2026-02-26 20:12:22.391702569 +0000 UTC m=+1084.928670493" observedRunningTime="2026-02-26 20:12:57.826256097 +0000 UTC m=+1120.363224041" watchObservedRunningTime="2026-02-26 20:12:57.849888718 +0000 UTC m=+1120.386856642"
Feb 26 20:12:58 crc kubenswrapper[4722]: I0226 20:12:58.157246 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" path="/var/lib/kubelet/pods/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8/volumes"
Feb 26 20:12:58 crc kubenswrapper[4722]: I0226 20:12:58.704056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.093268 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.187325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.187458 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.192182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4ffd934-6139-4ef1-92b2-a30b7798fe61" (UID: "f4ffd934-6139-4ef1-92b2-a30b7798fe61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.195949 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz" (OuterVolumeSpecName: "kube-api-access-fc8zz") pod "f4ffd934-6139-4ef1-92b2-a30b7798fe61" (UID: "f4ffd934-6139-4ef1-92b2-a30b7798fe61"). InnerVolumeSpecName "kube-api-access-fc8zz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.282710 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.287415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.289114 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.289177 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.390958 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"66980b23-7973-4558-91ba-6f53c2ad7046\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"66980b23-7973-4558-91ba-6f53c2ad7046\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391235 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"12bb8485-56aa-436e-abd8-5e63601f2ab8\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"12bb8485-56aa-436e-abd8-5e63601f2ab8\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66980b23-7973-4558-91ba-6f53c2ad7046" (UID: "66980b23-7973-4558-91ba-6f53c2ad7046"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391783 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.392570 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12bb8485-56aa-436e-abd8-5e63601f2ab8" (UID: "12bb8485-56aa-436e-abd8-5e63601f2ab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.394456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2" (OuterVolumeSpecName: "kube-api-access-vftz2") pod "12bb8485-56aa-436e-abd8-5e63601f2ab8" (UID: "12bb8485-56aa-436e-abd8-5e63601f2ab8"). InnerVolumeSpecName "kube-api-access-vftz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.395587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4" (OuterVolumeSpecName: "kube-api-access-zvbc4") pod "66980b23-7973-4558-91ba-6f53c2ad7046" (UID: "66980b23-7973-4558-91ba-6f53c2ad7046"). InnerVolumeSpecName "kube-api-access-zvbc4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.461504 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.476616 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.485214 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494719 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494755 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494766 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596170 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") "
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596493 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e110b2fa-c2a9-482e-9b60-8ca117d38d87" (UID: "e110b2fa-c2a9-482e-9b60-8ca117d38d87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596736 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb306548-9870-4ef0-ae38-af8d1edc3c3a" (UID: "cb306548-9870-4ef0-ae38-af8d1edc3c3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596758 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.597165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bdabe92-f114-4ce7-a52d-af8c640bf2ae" (UID: "7bdabe92-f114-4ce7-a52d-af8c640bf2ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6" (OuterVolumeSpecName: "kube-api-access-79bn6") pod "cb306548-9870-4ef0-ae38-af8d1edc3c3a" (UID: "cb306548-9870-4ef0-ae38-af8d1edc3c3a"). InnerVolumeSpecName "kube-api-access-79bn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7" (OuterVolumeSpecName: "kube-api-access-4vcc7") pod "7bdabe92-f114-4ce7-a52d-af8c640bf2ae" (UID: "7bdabe92-f114-4ce7-a52d-af8c640bf2ae"). InnerVolumeSpecName "kube-api-access-4vcc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5" (OuterVolumeSpecName: "kube-api-access-mfdr5") pod "e110b2fa-c2a9-482e-9b60-8ca117d38d87" (UID: "e110b2fa-c2a9-482e-9b60-8ca117d38d87"). InnerVolumeSpecName "kube-api-access-mfdr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698344 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698383 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698397 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698408 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.716650 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.717277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerDied","Data":"7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.717321 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerDied","Data":"67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718300 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718346 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerDied","Data":"4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722689 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722705 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerDied","Data":"e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724021 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724226 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724932 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerDied","Data":"dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724946 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724951 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerDied","Data":"3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119"}
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726109 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr"
Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726117 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.082459 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5h8j5"]
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083591 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083608 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083625 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083631 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083644 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083650 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083659 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083665 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083684 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083706 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083724 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="init"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083730 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="init"
Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083742 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083750 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083894 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083908 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083919 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083939 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083947 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083953 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083961 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerName="mariadb-database-create"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.084592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.088551 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.099995 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5h8j5"]
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.226262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.226357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.327928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.328814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.328916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.353043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.407463 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5h8j5"
Feb 26 20:13:02 crc kubenswrapper[4722]: W0226 20:13:02.063014 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ba9eec_3670_4f28_9d44_6356850f7e1b.slice/crio-fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603 WatchSource:0}: Error finding container fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603: Status 404 returned error can't find the container with id fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.063716 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5h8j5"]
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.762191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261"}
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765443 4722 generic.go:334] "Generic (PLEG): container finished" podID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerID="fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b" exitCode=0
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerDied","Data":"fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b"}
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerStarted","Data":"fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603"}
Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.821891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.937476122 podStartE2EDuration="57.821871953s" podCreationTimestamp="2026-02-26 20:12:05 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.767374379 +0000 UTC m=+1086.304342303" lastFinishedPulling="2026-02-26 20:13:01.65177021 +0000 UTC m=+1124.188738134" observedRunningTime="2026-02-26 20:13:02.795995121 +0000 UTC m=+1125.332963065" watchObservedRunningTime="2026-02-26 20:13:02.821871953 +0000 UTC m=+1125.358839877"
Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.162970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0"
Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.182059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0"
Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.234937 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.397812 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:03.775042 4722 generic.go:334] "Generic (PLEG): container finished" podID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerID="21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae" exitCode=0
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:03.775195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerDied","Data":"21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae"}
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.386260 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n5jvb"]
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.388033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n5jvb"
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.392780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.392998 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxdpq"
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.404729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n5jvb"]
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.484889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb"
Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.548000 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.586920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.586985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.587013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.587100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.596234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 
20:13:04.596766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.596974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.611427 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.612031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: W0226 20:13:04.621078 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29033310_ec4f_49d0_8899_349e3c6b02f9.slice/crio-8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b WatchSource:0}: Error finding container 8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b: Status 404 returned error can't find the container with id 8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.687861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: 
\"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"69ba9eec-3670-4f28-9d44-6356850f7e1b\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.687963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"69ba9eec-3670-4f28-9d44-6356850f7e1b\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.689076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69ba9eec-3670-4f28-9d44-6356850f7e1b" (UID: "69ba9eec-3670-4f28-9d44-6356850f7e1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.694344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm" (OuterVolumeSpecName: "kube-api-access-nxbcm") pod "69ba9eec-3670-4f28-9d44-6356850f7e1b" (UID: "69ba9eec-3670-4f28-9d44-6356850f7e1b"). InnerVolumeSpecName "kube-api-access-nxbcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.713909 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.789931 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.789958 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806695 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerDied","Data":"fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603"} Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806972 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806730 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.814407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.244885 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298994 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299028 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299118 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299159 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299181 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.300080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.300468 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf" (OuterVolumeSpecName: "kube-api-access-cbvbf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "kube-api-access-cbvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts" (OuterVolumeSpecName: "scripts") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.332769 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.339473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.340400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: W0226 20:13:05.343415 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ff41abb_b86e_4d09_93e2_a6eb93d9fcdf.slice/crio-3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d WatchSource:0}: Error finding container 3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d: Status 404 returned error can't find the container with id 3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401620 4722 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401878 4722 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401978 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402052 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402243 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402325 
4722 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402394 4722 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.828395 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerStarted","Data":"3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerDied","Data":"70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830347 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830366 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.093924 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.840466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"ccf98c56630ffad30528c4f675c3c22c6d53958d27e60cac60df9a0301241d2c"} Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.840512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"7d53e4eaefbe67c71fb9144618c24ea342374fca7fffb2908d3033d5e7a6b3b9"} Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.988623 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.306815 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.313763 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.409525 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.409590 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.411482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.852589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"17efb2ade3b49a0da34bc10ad8dabc866111c314fa466ef0bc92c700e9b099e6"} Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.852645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"c91a2ba256c822a236065384ef255e4237cb30774e0544bc90ed16bce8828d47"} Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.853962 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:08 crc kubenswrapper[4722]: I0226 20:13:08.161574 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" path="/var/lib/kubelet/pods/69ba9eec-3670-4f28-9d44-6356850f7e1b/volumes" Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.877438 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"d7d1a8589e874ccc2d92990fa8e9cf2c975daa1759ec1eac483cd4cffdddf053"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"46ff9ca52ec0c10b1d3b68a67cf37318910908d2e72580ab996c1d0c51bf4f2f"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"0029297d07064b72566d27dea4733bf08bb3e61f72753cb04bdecf0586505905"} Feb 26 
20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"cf3e41ddd6cf53a4b26d498aad3b24f4062820dde807f7d88d7046952c251b69"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.899890 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=< Feb 26 20:13:09 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 20:13:09 crc kubenswrapper[4722]: > Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.938863 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939181 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" containerID="cri-o://f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470" gracePeriod=600 Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939263 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" containerID="cri-o://eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" gracePeriod=600 Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939268 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" containerID="cri-o://6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b" gracePeriod=600 Feb 26 20:13:10 crc 
kubenswrapper[4722]: I0226 20:13:10.858397 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908920 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908961 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908973 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261"} Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.909027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b"} Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.909042 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470"} Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.171147 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.239639 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:13:11 crc kubenswrapper[4722]: E0226 20:13:11.239999 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: E0226 20:13:11.240041 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240047 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240237 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240267 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.271305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.431482 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.432632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.436121 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.452300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.452354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.477006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.555007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.556534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.586222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-xkflz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.588623 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.629786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.634831 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.702091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.759073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.759387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.761973 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.763200 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.770838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.774861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861125 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.863017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.871843 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.877088 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.884267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.898433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.914115 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.926266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-x7zlz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.928446 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.935486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936761 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.943261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7zlz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.957451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.959505 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.964730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.967021 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.969396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.974032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.990064 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.002183 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.003537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.039570 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.041186 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.043586 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.056398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.064532 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065953 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.088468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.168709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.170965 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171169 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.172562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.185395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.187385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.190320 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.263214 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.273153 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.273737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.276208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.285458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.288871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.289905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.345429 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cg47w"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.348935 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.349316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.354844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.363324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cg47w"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.377540 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.477347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.477473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.579383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.579488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.580827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.602928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.668233 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.897080 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=<
Feb 26 20:13:14 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 26 20:13:14 crc kubenswrapper[4722]: >
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.985643 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.991687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.212538 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"]
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.213839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.215547 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.219771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"]
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354198 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354740 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.355012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.409922 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": dial tcp 10.217.0.117:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456321 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: 
\"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: 
\"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.457423 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.458651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.482690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " 
pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.575112 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:16 crc kubenswrapper[4722]: I0226 20:13:16.987754 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:18 crc kubenswrapper[4722]: I0226 20:13:18.935729 4722 scope.go:117] "RemoveContainer" containerID="1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.053656 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=< Feb 26 20:13:20 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 20:13:20 crc kubenswrapper[4722]: > Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.409515 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.746730 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.761045 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.762741 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.768503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.768577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out" (OuterVolumeSpecName: "config-out") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863637 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863738 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863977 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864236 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864642 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865186 4722 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865271 4722 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865335 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865394 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.866448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config" (OuterVolumeSpecName: "config") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.869668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.870850 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9" (OuterVolumeSpecName: "kube-api-access-cjgf9") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "kube-api-access-cjgf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.886895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config" (OuterVolumeSpecName: "web-config") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.899997 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "pvc-3695ba2b-30e0-4cee-b990-4eee300994f3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.970975 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971485 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971642 4722 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971722 4722 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971848 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") on node \"crc\" " Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.984713 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.984995 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvrds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n5jvb_openstack(4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.986288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n5jvb" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.009285 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.010122 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3695ba2b-30e0-4cee-b990-4eee300994f3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3") on node "crc" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.029968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04"} Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.030018 4722 scope.go:117] "RemoveContainer" containerID="eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.030279 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.037877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-n5jvb" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.073652 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.082366 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.096899 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.109578 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="init-config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110031 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="init-config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110056 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110063 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110074 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110081 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110099 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110105 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110294 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110316 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110338 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.112244 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116106 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116312 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z8rrv"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116475 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116648 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116882 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117039 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117284 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.121737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.127450 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.278340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.278991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386275 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386300 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2afc96fa7f9c378e63298d168f739061cadeeb81c2b7504ca3dad6d4afb5d2c4/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.387693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.388924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.389641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.401239 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.425219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.441500 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.626073 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-object:current-podified"
Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.626240 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:object-server,Image:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,Command:[/usr/bin/swift-object-server /etc/swift/object-server.conf.d -v],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:object,HostPort:0,ContainerPort:6200,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc6h57fh675h5fbh594h57ch569h66dh58bh679h8dh5bch545h57dhddh57ch687h54fhb7h85h9fh668hd5h4h665h8fh66fh9dh544h56bh5f9h4q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:swift,ReadOnly:false,MountPath:/srv/node/pv,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cache,ReadOnly:false,MountPath:/var/cache/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lock,ReadOnly:false,MountPath:/var/lock,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqn8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-storage-0_openstack(29033310-ec4f-49d0-8899-349e3c6b02f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.662630 4722 scope.go:117] "RemoveContainer" containerID="6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.925075 4722 scope.go:117] "RemoveContainer" containerID="f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470"
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.975060 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.993853 4722 scope.go:117] "RemoveContainer" containerID="14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b"
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.021276 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd56edfd6_ff9d_4a81_820c_250a94048683.slice/crio-9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1 WatchSource:0}: Error finding container 9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1: Status 404 returned error can't find the container with id 9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.042739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.055537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerStarted","Data":"9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1"}
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.164735 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" path="/var/lib/kubelet/pods/94e2a737-a422-4ef4-9394-324953ef1ff2/volumes"
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.268901 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:13:22 crc kubenswrapper[4722]: E0226 20:13:22.338167 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9"
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.425993 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3059b1f6_b323_4632_8296_c4eec81bb239.slice/crio-ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811 WatchSource:0}: Error finding container ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811: Status 404 returned error can't find the container with id ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.427681 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.437820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.463202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.469871 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.473147 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.822309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.831081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cg47w"]
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.836635 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d98fd3_85f9_400a_9492_7add2a485d7c.slice/crio-1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6 WatchSource:0}: Error finding container 1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6: Status 404 returned error can't find the container with id 1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.840436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.846794 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751959d7_d249_457b_896e_fbc800f4d2bf.slice/crio-cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0 WatchSource:0}: Error finding container cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0: Status 404 returned error can't find the container with id cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.851333 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.851553 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.853015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7zlz"]
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.861371 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4315c1e_5007_4f92_b729_ac02cfdbc2ce.slice/crio-f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043 WatchSource:0}: Error finding container f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043: Status 404 returned error can't find the container with id f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.869164 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"]
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.879815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"]
Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.888295 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc9c4b4_0f7b_4309_aca4_57e977029936.slice/crio-cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed WatchSource:0}: Error finding container cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed: Status 404 returned error can't find the container with id cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed
Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.890893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067273 4722 generic.go:334] "Generic (PLEG): container finished" podID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerID="30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerDied","Data":"30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerStarted","Data":"948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083179 4722 generic.go:334] "Generic (PLEG): container finished" podID="3059b1f6-b323-4632-8296-c4eec81bb239" containerID="85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerDied","Data":"85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerStarted","Data":"ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085416 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerID="ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerDied","Data":"ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerStarted","Data":"abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.090054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"a3ab234b3c06f427fed0cd203c103f1cbfcd9676884f625650763401bc38370e"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.094125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerStarted","Data":"bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.095247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerStarted","Data":"1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.096826 4722 generic.go:334] "Generic (PLEG): container finished" podID="d56edfd6-ff9d-4a81-820c-250a94048683" containerID="116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.096877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerDied","Data":"116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5"}
Feb 26 20:13:23 crc kubenswrapper[4722]: E0226 20:13:23.097078 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9"
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.097892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerStarted","Data":"f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.098688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.099929 4722 generic.go:334] "Generic (PLEG): container finished" podID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerID="41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.099986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerDied","Data":"41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.100005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerStarted","Data":"773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.103948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerStarted","Data":"9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.107378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerStarted","Data":"cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119499 4722 generic.go:334] "Generic (PLEG): container finished" podID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerID="fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b" exitCode=0
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerDied","Data":"fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerStarted","Data":"f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728"}
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.489633 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.489691 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.130244 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerID="fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41" exitCode=0
Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.130381 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerDied","Data":"fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41"}
Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.131811 4722 generic.go:334] "Generic (PLEG): container finished" podID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerID="84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b" exitCode=0
Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.131849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerDied","Data":"84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b"}
Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.135686 4722 generic.go:334] "Generic (PLEG): container
finished" podID="e2de5980-b357-42e1-8630-ea5b2751f224" containerID="db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.135739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerDied","Data":"db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d"} Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.137249 4722 generic.go:334] "Generic (PLEG): container finished" podID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerID="fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.137515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerDied","Data":"fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6"} Feb 26 20:13:24 crc kubenswrapper[4722]: E0226 20:13:24.143963 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.595068 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.766311 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"d56edfd6-ff9d-4a81-820c-250a94048683\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.766384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"d56edfd6-ff9d-4a81-820c-250a94048683\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.767003 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d56edfd6-ff9d-4a81-820c-250a94048683" (UID: "d56edfd6-ff9d-4a81-820c-250a94048683"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.777767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl" (OuterVolumeSpecName: "kube-api-access-8d8tl") pod "d56edfd6-ff9d-4a81-820c-250a94048683" (UID: "d56edfd6-ff9d-4a81-820c-250a94048683"). InnerVolumeSpecName "kube-api-access-8d8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.868459 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.868493 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.898804 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rsgbx" Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.146897 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89" Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.157658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerDied","Data":"9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1"} Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.157835 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1" Feb 26 20:13:26 crc kubenswrapper[4722]: I0226 20:13:26.988007 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.175196 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.178340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerDied","Data":"f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.178385 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.181567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" 
event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerDied","Data":"ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.181598 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.182747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerDied","Data":"abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.182773 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.183804 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerDied","Data":"948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.183829 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.184765 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerDied","Data":"f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.184792 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043" Feb 26 20:13:27 crc 
kubenswrapper[4722]: I0226 20:13:27.186378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerDied","Data":"cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.186421 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.187603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerDied","Data":"bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.187641 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.188792 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerDied","Data":"1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.188820 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.189899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerDied","Data":"773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.189926 4722 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.196687 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.217833 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.243643 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.255733 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.267435 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.276197 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.287703 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.306600 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.308853 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329524 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"3059b1f6-b323-4632-8296-c4eec81bb239\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"3059b1f6-b323-4632-8296-c4eec81bb239\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329789 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329898 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.330857 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts" (OuterVolumeSpecName: "scripts") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331703 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3059b1f6-b323-4632-8296-c4eec81bb239" (UID: "3059b1f6-b323-4632-8296-c4eec81bb239"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331750 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run" (OuterVolumeSpecName: "var-run") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.332228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.332303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.336346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw" (OuterVolumeSpecName: "kube-api-access-6kfdw") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "kube-api-access-6kfdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.336617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm" (OuterVolumeSpecName: "kube-api-access-mxdxm") pod "3059b1f6-b323-4632-8296-c4eec81bb239" (UID: "3059b1f6-b323-4632-8296-c4eec81bb239"). InnerVolumeSpecName "kube-api-access-mxdxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"e2de5980-b357-42e1-8630-ea5b2751f224\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod 
\"d8205614-2f8f-4d32-8522-e76f6e7b9c69\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"50d98fd3-85f9-400a-9492-7add2a485d7c\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"4091b496-0010-42d3-97d6-281d47ae3f1c\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"4091b496-0010-42d3-97d6-281d47ae3f1c\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431734 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"50d98fd3-85f9-400a-9492-7add2a485d7c\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431970 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"e2de5980-b357-42e1-8630-ea5b2751f224\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8205614-2f8f-4d32-8522-e76f6e7b9c69" (UID: "d8205614-2f8f-4d32-8522-e76f6e7b9c69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4315c1e-5007-4f92-b729-ac02cfdbc2ce" (UID: "e4315c1e-5007-4f92-b729-ac02cfdbc2ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50d98fd3-85f9-400a-9492-7add2a485d7c" (UID: "50d98fd3-85f9-400a-9492-7add2a485d7c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432435 4722 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432450 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432460 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432469 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432477 4722 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432485 4722 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432493 4722 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432501 4722 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432509 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432517 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432525 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2842874a-dd3a-44ba-ba7e-e0d8f41be944" (UID: "2842874a-dd3a-44ba-ba7e-e0d8f41be944"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" (UID: "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.433437 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4091b496-0010-42d3-97d6-281d47ae3f1c" (UID: "4091b496-0010-42d3-97d6-281d47ae3f1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.433452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2de5980-b357-42e1-8630-ea5b2751f224" (UID: "e2de5980-b357-42e1-8630-ea5b2751f224"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.434487 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj" (OuterVolumeSpecName: "kube-api-access-tc4vj") pod "e2de5980-b357-42e1-8630-ea5b2751f224" (UID: "e2de5980-b357-42e1-8630-ea5b2751f224"). InnerVolumeSpecName "kube-api-access-tc4vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.435433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb" (OuterVolumeSpecName: "kube-api-access-zxktb") pod "4091b496-0010-42d3-97d6-281d47ae3f1c" (UID: "4091b496-0010-42d3-97d6-281d47ae3f1c"). InnerVolumeSpecName "kube-api-access-zxktb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436015 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w" (OuterVolumeSpecName: "kube-api-access-qjt7w") pod "d8205614-2f8f-4d32-8522-e76f6e7b9c69" (UID: "d8205614-2f8f-4d32-8522-e76f6e7b9c69"). InnerVolumeSpecName "kube-api-access-qjt7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk" (OuterVolumeSpecName: "kube-api-access-jv5xk") pod "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" (UID: "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5"). InnerVolumeSpecName "kube-api-access-jv5xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht" (OuterVolumeSpecName: "kube-api-access-jsrht") pod "2842874a-dd3a-44ba-ba7e-e0d8f41be944" (UID: "2842874a-dd3a-44ba-ba7e-e0d8f41be944"). InnerVolumeSpecName "kube-api-access-jsrht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29" (OuterVolumeSpecName: "kube-api-access-rjc29") pod "e4315c1e-5007-4f92-b729-ac02cfdbc2ce" (UID: "e4315c1e-5007-4f92-b729-ac02cfdbc2ce"). InnerVolumeSpecName "kube-api-access-rjc29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.437933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw" (OuterVolumeSpecName: "kube-api-access-xcbxw") pod "50d98fd3-85f9-400a-9492-7add2a485d7c" (UID: "50d98fd3-85f9-400a-9492-7add2a485d7c"). InnerVolumeSpecName "kube-api-access-xcbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533805 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533845 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533858 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533868 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533877 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533887 4722 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533898 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533906 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533914 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533925 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533933 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.203307 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.203605 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204362 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204380 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204431 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerStarted","Data":"709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b"} Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204517 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204546 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204806 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.257763 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-x7zlz" podStartSLOduration=13.036021357 podStartE2EDuration="17.257743091s" podCreationTimestamp="2026-02-26 20:13:11 +0000 UTC" firstStartedPulling="2026-02-26 20:13:22.843072687 +0000 UTC m=+1145.380040621" lastFinishedPulling="2026-02-26 20:13:27.064794431 +0000 UTC m=+1149.601762355" observedRunningTime="2026-02-26 20:13:28.237092332 +0000 UTC m=+1150.774060266" watchObservedRunningTime="2026-02-26 20:13:28.257743091 +0000 UTC m=+1150.794711015" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.337461 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"] Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.355388 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"] Feb 26 20:13:30 crc kubenswrapper[4722]: I0226 20:13:30.157207 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" path="/var/lib/kubelet/pods/ecc9c4b4-0f7b-4309-aca4-57e977029936/volumes" Feb 26 20:13:32 crc kubenswrapper[4722]: I0226 20:13:32.251870 4722 generic.go:334] "Generic (PLEG): container finished" podID="751959d7-d249-457b-896e-fbc800f4d2bf" containerID="51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1" exitCode=0 Feb 26 20:13:32 crc kubenswrapper[4722]: I0226 20:13:32.251972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerDied","Data":"51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1"} Feb 26 20:13:33 crc kubenswrapper[4722]: I0226 20:13:33.262493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"cf897ee9675fddb29112972ce30c68231ad360b163c501c1c8b70a5f10acf294"} Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.280858 4722 generic.go:334] "Generic (PLEG): container finished" podID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerID="709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b" exitCode=0 Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.280933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerDied","Data":"709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b"} Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.284327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"4cb5cdca58c759de4d3b8e61b32880c9c8b8ec90ec6598453d3b4f7aeae9f420"} Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.313213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"3a6b48a2238faa57bf379387899719706e456241d8fa17ccfd0f50be15dc74cf"} Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.352603 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.352585998 podStartE2EDuration="15.352585998s" podCreationTimestamp="2026-02-26 20:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:36.344010375 +0000 UTC m=+1158.880978309" watchObservedRunningTime="2026-02-26 20:13:36.352585998 +0000 UTC m=+1158.889553922" Feb 26 20:13:36 crc kubenswrapper[4722]: 
I0226 20:13:36.441982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.442144 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.449185 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.630609 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.730702 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf" 
(OuterVolumeSpecName: "kube-api-access-ldrrf") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "kube-api-access-ldrrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.756683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.777231 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data" (OuterVolumeSpecName: "config-data") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828217 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828247 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828257 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.987281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.324989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerStarted","Data":"7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerDied","Data":"9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328104 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-x7zlz" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328120 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"da33fbe827b9388df95847364ac3401f8d109245f8fa6e0cbddb6b64f987672b"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"50a4680e83d7d274a1dc6c88a9792b9774f66b38e576b7fbf3b7bec2c43c477c"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"4af95cb2fb1adf08a5ad10fb57e91542cf064cc3f5b5db02a6607bfa7ed52b7f"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.344443 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.364174 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n5jvb" podStartSLOduration=2.51578758 podStartE2EDuration="33.364136475s" podCreationTimestamp="2026-02-26 20:13:04 +0000 UTC" firstStartedPulling="2026-02-26 20:13:05.34684556 +0000 UTC m=+1127.883813484" lastFinishedPulling="2026-02-26 20:13:36.195194455 +0000 UTC m=+1158.732162379" observedRunningTime="2026-02-26 20:13:37.35176659 +0000 UTC m=+1159.888734524" watchObservedRunningTime="2026-02-26 20:13:37.364136475 +0000 UTC m=+1159.901104389" Feb 26 
20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545761 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545779 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545791 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545798 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545806 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545822 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545827 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545836 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" 
containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545842 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545850 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545856 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545865 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545871 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545889 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545896 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545907 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545913 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545927 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545934 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545946 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545952 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546118 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546125 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546142 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546173 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546186 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" 
containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546195 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546209 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546220 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546229 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546238 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546883 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.556736 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.556980 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557159 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557287 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557483 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.559362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.573119 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.602533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646529 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 
20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " 
pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646767 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc 
kubenswrapper[4722]: I0226 20:13:37.748500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748551 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748631 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.749654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.752174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.753402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.753901 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.754262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.759865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.765834 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.768262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: 
\"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.785555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.808230 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.810056 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.829663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830227 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z2nlt" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.847487 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.854074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgc2\" (UniqueName: 
\"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855173 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc 
kubenswrapper[4722]: I0226 20:13:37.855294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855334 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869910 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.870095 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.897306 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.902644 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.915237 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.916449 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923041 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923317 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6xzhb" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.949079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967099 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod 
\"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod 
\"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.977343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.987122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.987889 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.990171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.995574 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.995638 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.019848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.045106 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.069496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.069565 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.079793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod 
\"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.079995 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.080021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.080129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.081570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.081729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " 
pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.094776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.126973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129846 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.132463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: 
\"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.132763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.135022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.139712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.181265 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.186308 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.191251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.191461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.193323 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.194731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.196439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.196697 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qjfvw" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.219745 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.224929 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.231805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.252189 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.253254 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.261028 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.280663 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.280839 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.283316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.289252 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290421 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290471 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290520 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290590 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301824 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dflrm" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301859 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.319527 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.387769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"dd5d67e676f2414654f0503af5bbe9faa8eb192f1a8cfaf617879de18964047d"} Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.391893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 
20:13:38.391956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.391975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392180 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: 
\"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod 
\"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 
20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494572 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.554837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.557278 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.558737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.561323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563315 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.567663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" 
(UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.567977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.569578 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.583378 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.584118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.595209 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 
crc kubenswrapper[4722]: I0226 20:13:38.621446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.621652 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.623197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.627194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.627882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.628760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: 
\"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.630093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.748654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.820637 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.844954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.846913 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.848752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.850545 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: W0226 20:13:38.904835 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29a7c1b_064e_439e_8fca_5f5f3d323dd9.slice/crio-3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704 WatchSource:0}: Error finding container 3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704: Status 404 returned error can't find the container with id 3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704 Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.162708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.456251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.470364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerStarted","Data":"cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.483688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.543904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"f8d5bd1bf6848c59bda02692ecfe0c1552fa3a1d5d88f65cfce4e0ebdaa243e9"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.559267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" 
event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerStarted","Data":"3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.577537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerStarted","Data":"fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.767700 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.822631 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.847019 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.978123 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.001061 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:40 crc kubenswrapper[4722]: W0226 20:13:40.015882 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d551533_7396_4941_a62c_b1a0039f6ddc.slice/crio-24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5 WatchSource:0}: Error finding container 24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5: Status 404 returned error can't find the container with id 24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5 Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.610372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" 
event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerStarted","Data":"b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.613200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"db737bb35890c1c6ada44a53fbe5b35f5ec6b4917823fc3fd7aa46e8919c0258"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.616438 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerStarted","Data":"81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.636420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"3834c16b3dcb910a52de373cbb1f4abfd47f6b9ef16d4622fb5827249fd13be4"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.637288 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xbkst" podStartSLOduration=3.6372609860000003 podStartE2EDuration="3.637260986s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:40.636549887 +0000 UTC m=+1163.173517821" watchObservedRunningTime="2026-02-26 20:13:40.637260986 +0000 UTC m=+1163.174228910" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.637917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerStarted","Data":"24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 
20:13:40.639236 4722 generic.go:334] "Generic (PLEG): container finished" podID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerID="2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9" exitCode=0 Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.639272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerDied","Data":"2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.643630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerStarted","Data":"deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.643668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerStarted","Data":"abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.646620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerStarted","Data":"38fd8a7e50782529f9d4d5f35cf50a7969adc433daff84796eca55ad102ba45a"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.648829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerStarted","Data":"7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.708651 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.137544999 podStartE2EDuration="54.7086327s" 
podCreationTimestamp="2026-02-26 20:12:46 +0000 UTC" firstStartedPulling="2026-02-26 20:13:04.623307163 +0000 UTC m=+1127.160275087" lastFinishedPulling="2026-02-26 20:13:36.194394864 +0000 UTC m=+1158.731362788" observedRunningTime="2026-02-26 20:13:40.677412264 +0000 UTC m=+1163.214380198" watchObservedRunningTime="2026-02-26 20:13:40.7086327 +0000 UTC m=+1163.245600624" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.738951 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b8gvr" podStartSLOduration=3.73892933 podStartE2EDuration="3.73892933s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:40.728237371 +0000 UTC m=+1163.265205295" watchObservedRunningTime="2026-02-26 20:13:40.73892933 +0000 UTC m=+1163.275897254" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.960005 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.984595 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.986193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.989465 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.052810 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.083647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113634 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113661 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" 
(UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214856 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214929 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215093 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215323 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216826 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.217557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: 
\"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.218399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.234764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2" (OuterVolumeSpecName: "kube-api-access-wvgc2") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "kube-api-access-wvgc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.240381 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.252486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config" (OuterVolumeSpecName: "config") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.268502 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.277119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.285102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.317552 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.317897 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319771 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319784 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319795 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.401816 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673208 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerDied","Data":"3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704"} Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673282 4722 scope.go:117] "RemoveContainer" containerID="2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673433 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.680265 4722 generic.go:334] "Generic (PLEG): container finished" podID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" exitCode=0 Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.681471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277"} Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.856041 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.879234 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.033003 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.159701 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" 
path="/var/lib/kubelet/pods/f29a7c1b-064e-439e-8fca-5f5f3d323dd9/volumes" Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717521 4722 generic.go:334] "Generic (PLEG): container finished" podID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerID="a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce" exitCode=0 Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerStarted","Data":"7967cef7aeddeb15162c1de8e5c92229cffd11f032094d514f5c0f541eb96ee7"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.723896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerStarted","Data":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.724034 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns" containerID="cri-o://61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" gracePeriod=10 Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.724279 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.774277 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" 
podStartSLOduration=5.774256306 podStartE2EDuration="5.774256306s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:42.757742028 +0000 UTC m=+1165.294709982" watchObservedRunningTime="2026-02-26 20:13:42.774256306 +0000 UTC m=+1165.311224230" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.384085 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478819 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478916 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc 
kubenswrapper[4722]: I0226 20:13:43.479008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.487353 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88" (OuterVolumeSpecName: "kube-api-access-r4k88") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "kube-api-access-r4k88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.552733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.552906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582929 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582969 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582982 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.585855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.605853 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config" (OuterVolumeSpecName: "config") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.684716 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.684751 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757297 4722 generic.go:334] "Generic (PLEG): container finished" podID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" exitCode=0 Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757369 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"38fd8a7e50782529f9d4d5f35cf50a7969adc433daff84796eca55ad102ba45a"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757447 4722 scope.go:117] "RemoveContainer" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.760705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" 
event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerStarted","Data":"8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.761874 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.790565 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podStartSLOduration=3.790523511 podStartE2EDuration="3.790523511s" podCreationTimestamp="2026-02-26 20:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:43.787436618 +0000 UTC m=+1166.324404562" watchObservedRunningTime="2026-02-26 20:13:43.790523511 +0000 UTC m=+1166.327491465" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.818267 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.843249 4722 scope.go:117] "RemoveContainer" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.863174 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899396 4722 scope.go:117] "RemoveContainer" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: E0226 20:13:43.899897 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": container with ID starting with 61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead not found: ID does not exist" 
containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899934 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} err="failed to get container status \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": rpc error: code = NotFound desc = could not find container \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": container with ID starting with 61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead not found: ID does not exist" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899960 4722 scope.go:117] "RemoveContainer" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: E0226 20:13:43.901770 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": container with ID starting with 3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277 not found: ID does not exist" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.901805 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277"} err="failed to get container status \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": rpc error: code = NotFound desc = could not find container \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": container with ID starting with 3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277 not found: ID does not exist" Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.162207 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" path="/var/lib/kubelet/pods/0651c832-c66b-4004-8564-ff8a4b2c002e/volumes" Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.775207 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerID="81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925" exitCode=0 Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.775265 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerDied","Data":"81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925"} Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.378404 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550364 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 
20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.556623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.557330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.559501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts" (OuterVolumeSpecName: "scripts") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.561210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc" (OuterVolumeSpecName: "kube-api-access-5jjpc") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "kube-api-access-5jjpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.584549 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data" (OuterVolumeSpecName: "config-data") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.598536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.653996 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654031 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654043 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654051 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654059 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654068 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.182978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerDied","Data":"cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904"} Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 
20:13:47.183029 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.183104 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.188315 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.203062 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.219820 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220658 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220682 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns" Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220757 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220766 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init" Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerName="keystone-bootstrap" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220827 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" 
containerName="keystone-bootstrap" Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220853 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="init" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220899 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="init" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221398 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221435 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerName="keystone-bootstrap" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221448 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.222486 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.223999 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224925 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.228354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368319 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: 
\"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368518 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: 
\"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470643 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.475576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc 
kubenswrapper[4722]: I0226 20:13:47.476290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.476658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.482632 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.487510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.495116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.539000 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7s744" Feb 26 20:13:48 crc kubenswrapper[4722]: I0226 20:13:48.162553 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" path="/var/lib/kubelet/pods/9f02859c-f39e-4ddd-9503-bfdccbbd534b/volumes" Feb 26 20:13:49 crc kubenswrapper[4722]: I0226 20:13:49.204677 4722 generic.go:334] "Generic (PLEG): container finished" podID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerID="7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5" exitCode=0 Feb 26 20:13:49 crc kubenswrapper[4722]: I0226 20:13:49.204764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerDied","Data":"7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5"} Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.403329 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.468598 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.468810 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" containerID="cri-o://4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" gracePeriod=10 Feb 26 20:13:52 crc kubenswrapper[4722]: I0226 20:13:52.232770 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerID="4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" exitCode=0 Feb 26 20:13:52 crc kubenswrapper[4722]: I0226 20:13:52.232878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" 
event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325"} Feb 26 20:13:53 crc kubenswrapper[4722]: I0226 20:13:53.487635 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:13:53 crc kubenswrapper[4722]: I0226 20:13:53.487689 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.387174 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.401167 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.574434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.574631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds" (OuterVolumeSpecName: "kube-api-access-jvrds") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "kube-api-access-jvrds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.589029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.618636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data" (OuterVolumeSpecName: "config-data") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665166 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665209 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665225 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665239 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerDied","Data":"3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d"} Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280801 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280873 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.708856 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.709376 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tzkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices
:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-79m6p_openstack(3d551533-7396-4941-a62c-b1a0039f6ddc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.711047 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-79m6p" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.839404 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.840597 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.840618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.840814 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.841904 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.855354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892240 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994316 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.995419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod 
\"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.997485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.015240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.172562 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:13:58 crc kubenswrapper[4722]: E0226 20:13:58.292860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-79m6p" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.723469 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.727130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734701 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxdpq" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734910 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.742945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.912831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.912905 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 
20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.001194 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.002789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.005331 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod 
\"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015428 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.016111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " 
pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.017553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026440 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026480 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.036249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: 
I0226 20:13:59.036782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.052048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.060385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.090563 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.117755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118105 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118376 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.219884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.219995 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220349 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.221189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.222726 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.222756 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.226663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.228246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.233395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.239176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod 
\"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.260566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.308058 4722 generic.go:334] "Generic (PLEG): container finished" podID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerID="deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8" exitCode=0 Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.308175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerDied","Data":"deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8"} Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.324257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.367682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.136715 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.138306 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.140327 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.140751 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.141338 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.167015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.238770 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.347452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.383465 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " 
pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.464475 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.579178 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.659602 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:06 crc kubenswrapper[4722]: I0226 20:14:06.386571 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.801847 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.809204 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: 
\"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.966552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn" (OuterVolumeSpecName: "kube-api-access-nh8tn") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "kube-api-access-nh8tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.983717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr" (OuterVolumeSpecName: "kube-api-access-v6xwr") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "kube-api-access-v6xwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.989897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config" (OuterVolumeSpecName: "config") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.992259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.009682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.011760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.013549 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config" (OuterVolumeSpecName: "config") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.013837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067099 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067199 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067226 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067237 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 
20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067247 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067257 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067267 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067276 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.413798 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f"} Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.414159 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.414259 4722 scope.go:117] "RemoveContainer" containerID="4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerDied","Data":"abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3"} Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416282 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416283 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.449341 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.453880 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.928274 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.928557 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd5tb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m2kjh_openstack(0f37d21c-75cb-471a-b68c-db4207ba0f6b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.930190 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m2kjh" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.077082 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116578 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116931 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="init" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116946 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="init" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116959 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116974 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116980 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.117176 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.117188 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.120107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.134758 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.231093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.237064 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.239708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.242708 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243156 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243306 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243447 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6xzhb" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: 
\"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299253 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.300225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.300792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" 
Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.301921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.302712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.303324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.321823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.387626 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.429463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-m2kjh" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.454735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521554 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.528314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.528985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.537126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.537309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.546325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56pn\" (UniqueName: 
\"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.571121 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:12 crc kubenswrapper[4722]: I0226 20:14:12.157125 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" path="/var/lib/kubelet/pods/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40/volumes" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.288987 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.291924 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.295583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.296052 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.305791 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454962 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455065 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455118 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qt84\" 
(UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.563463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.563524 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.564474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod 
\"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.569457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.571426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.588961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.589706 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.627619 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:14 crc kubenswrapper[4722]: I0226 20:14:14.896041 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.235910 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.235962 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.236085 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4g2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-9bqd7_openstack(04f47952-580e-40b8-80f0-25d1bf8ccc22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.237351 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-9bqd7" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" Feb 26 20:14:15 crc kubenswrapper[4722]: I0226 20:14:15.241281 4722 scope.go:117] "RemoveContainer" containerID="f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c" Feb 26 20:14:15 crc kubenswrapper[4722]: I0226 20:14:15.468016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerStarted","Data":"08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2"} Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.501053 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-9bqd7" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.123229 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.160931 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 
20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.248620 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.307647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.359355 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.424206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.521863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerStarted","Data":"af80be43a4f759336c063262cda6ee0c65a1cee2d4142bf9a200e025984a966f"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.524719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerStarted","Data":"e9f3cf3dd8ff0a42728cf74bba57287fbde3911334c60b0bdae4dd8003e33c2b"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.529000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.530748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerStarted","Data":"97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.534605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"970f41597f024532c87bbd96c198d180f46552c5264cf6816ff381f43bcd3b63"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.571361 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7s744" podStartSLOduration=29.571343367 podStartE2EDuration="29.571343367s" podCreationTimestamp="2026-02-26 20:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:16.570587446 +0000 UTC m=+1199.107555380" watchObservedRunningTime="2026-02-26 20:14:16.571343367 +0000 UTC m=+1199.108311311" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.584470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"13fdf3fbbd44bcdef851fc4937da95414f9511f1d58caad15216959bbf0ce9d4"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.613437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerStarted","Data":"734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.665511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerStarted","Data":"96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.705409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" 
event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerStarted","Data":"623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.726008 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h94hg" podStartSLOduration=10.07390563 podStartE2EDuration="39.725993625s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.055276604 +0000 UTC m=+1162.592244518" lastFinishedPulling="2026-02-26 20:14:09.707364589 +0000 UTC m=+1192.244332513" observedRunningTime="2026-02-26 20:14:16.68332363 +0000 UTC m=+1199.220291554" watchObservedRunningTime="2026-02-26 20:14:16.725993625 +0000 UTC m=+1199.262961549" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.850229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-79m6p" podStartSLOduration=4.441062116 podStartE2EDuration="39.850210799s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.019530495 +0000 UTC m=+1162.556498409" lastFinishedPulling="2026-02-26 20:14:15.428679158 +0000 UTC m=+1197.965647092" observedRunningTime="2026-02-26 20:14:16.752612786 +0000 UTC m=+1199.289580710" watchObservedRunningTime="2026-02-26 20:14:16.850210799 +0000 UTC m=+1199.387178723" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.859452 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.716993 4722 generic.go:334] "Generic (PLEG): container finished" podID="00da7f47-5a02-488d-99df-113c54217bcc" containerID="c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b" exitCode=0 Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.717086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" 
event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerDied","Data":"c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.719637 4722 generic.go:334] "Generic (PLEG): container finished" podID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerID="e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6" exitCode=0 Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.719691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"2a0f4b08ed52374b7cc2281865a14714c885f9f0925762a095546b787fd0453f"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724980 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.727328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.728613 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.728636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"ff4cde58f299f02631a9d16ca39b9e73725979d99ee58934a90a37584ac923d7"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732699 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.809324 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-647dc79bf7-sr259" podStartSLOduration=4.8093054760000005 podStartE2EDuration="4.809305476s" podCreationTimestamp="2026-02-26 20:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 20:14:17.779896099 +0000 UTC m=+1200.316864033" watchObservedRunningTime="2026-02-26 20:14:17.809305476 +0000 UTC m=+1200.346273400" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.850559 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b7cfb9b54-qvhbm" podStartSLOduration=6.850536872 podStartE2EDuration="6.850536872s" podCreationTimestamp="2026-02-26 20:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:17.837956342 +0000 UTC m=+1200.374924266" watchObservedRunningTime="2026-02-26 20:14:17.850536872 +0000 UTC m=+1200.387504796" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.510888 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613320 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc 
kubenswrapper[4722]: I0226 20:14:18.613477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.618861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v" (OuterVolumeSpecName: "kube-api-access-n9t8v") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "kube-api-access-n9t8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.725387 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.774884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerDied","Data":"af80be43a4f759336c063262cda6ee0c65a1cee2d4142bf9a200e025984a966f"} Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790214 4722 scope.go:117] "RemoveContainer" containerID="c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790317 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.797308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.827481 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.827517 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.834033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.842192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config" (OuterVolumeSpecName: "config") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.849218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932527 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932563 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932572 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:19 crc kubenswrapper[4722]: I0226 20:14:19.145313 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:19 crc kubenswrapper[4722]: I0226 20:14:19.156091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:20 crc kubenswrapper[4722]: I0226 20:14:20.156872 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00da7f47-5a02-488d-99df-113c54217bcc" path="/var/lib/kubelet/pods/00da7f47-5a02-488d-99df-113c54217bcc/volumes" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.485765 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.819061 4722 generic.go:334] "Generic (PLEG): container finished" podID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerID="97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66" exitCode=0 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.819124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerDied","Data":"97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.820556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerStarted","Data":"5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.823768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerStarted","Data":"794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.823876 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.826110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.828288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.828416 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd" containerID="cri-o://d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" gracePeriod=30 Feb 26 20:14:21 
crc kubenswrapper[4722]: I0226 20:14:21.828406 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log" containerID="cri-o://5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832315 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832471 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log" containerID="cri-o://469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832574 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd" containerID="cri-o://ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.870354 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.870326387 podStartE2EDuration="24.870326387s" podCreationTimestamp="2026-02-26 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.858359523 +0000 UTC m=+1204.395327447" watchObservedRunningTime="2026-02-26 20:14:21.870326387 +0000 UTC m=+1204.407294311" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 
20:14:21.887688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535614-l66lm" podStartSLOduration=16.773441839 podStartE2EDuration="21.887669316s" podCreationTimestamp="2026-02-26 20:14:00 +0000 UTC" firstStartedPulling="2026-02-26 20:14:16.130061864 +0000 UTC m=+1198.667029788" lastFinishedPulling="2026-02-26 20:14:21.244289321 +0000 UTC m=+1203.781257265" observedRunningTime="2026-02-26 20:14:21.879266539 +0000 UTC m=+1204.416234463" watchObservedRunningTime="2026-02-26 20:14:21.887669316 +0000 UTC m=+1204.424637240" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.905594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podStartSLOduration=10.905572902 podStartE2EDuration="10.905572902s" podCreationTimestamp="2026-02-26 20:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.893291899 +0000 UTC m=+1204.430259823" watchObservedRunningTime="2026-02-26 20:14:21.905572902 +0000 UTC m=+1204.442540836" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.927222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.927205308 podStartE2EDuration="24.927205308s" podCreationTimestamp="2026-02-26 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.913530707 +0000 UTC m=+1204.450498631" watchObservedRunningTime="2026-02-26 20:14:21.927205308 +0000 UTC m=+1204.464173232" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.714462 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.833842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834085 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834496 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.835884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.836716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs" (OuterVolumeSpecName: "logs") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.842341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns" (OuterVolumeSpecName: "kube-api-access-klrns") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "kube-api-access-klrns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.843350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts" (OuterVolumeSpecName: "scripts") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.869362 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerID="734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec" exitCode=0 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.869427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerDied","Data":"734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.875101 4722 generic.go:334] "Generic (PLEG): container finished" podID="a81e036d-5879-4813-bfda-9a203246b1e3" containerID="5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04" exitCode=0 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.875213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerDied","Data":"5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.880108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (OuterVolumeSpecName: "glance") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). 
InnerVolumeSpecName "pvc-b7104307-bea6-42a8-bb91-b3367a15255d". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882342 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerID="ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" exitCode=0 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882365 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerID="469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" exitCode=143 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.891419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895180 4722 generic.go:334] "Generic (PLEG): container finished" podID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" exitCode=0 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895206 4722 generic.go:334] "Generic (PLEG): container finished" podID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" exitCode=143 Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895399 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895888 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"970f41597f024532c87bbd96c198d180f46552c5264cf6816ff381f43bcd3b63"} Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895904 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937268 4722 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937333 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" " Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937352 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937365 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937379 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937390 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.962565 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.962904 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d") on node "crc" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.965497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data" (OuterVolumeSpecName: "config-data") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.966169 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.994603 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.015861 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.016575 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016612 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} err="failed to get container 
status \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016630 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.016811 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} err="failed to get container status \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016840 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017041 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} err="failed to get 
container status \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017060 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017346 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} err="failed to get container status \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039290 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.043291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.044364 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.044392 4722 reconciler_common.go:293] "Volume detached for volume 
\"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.045420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.048602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs" (OuterVolumeSpecName: "logs") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.060687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz" (OuterVolumeSpecName: "kube-api-access-jsccz") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "kube-api-access-jsccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.065477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts" (OuterVolumeSpecName: "scripts") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.103318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (OuterVolumeSpecName: "glance") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "pvc-c3598451-3b65-4991-9779-75a64db7d9c0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.118074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.118245 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data" (OuterVolumeSpecName: "config-data") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146456 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146482 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146492 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146517 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146527 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146537 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146546 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 
20:14:23.192893 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.193107 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0") on node "crc" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.252778 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.323352 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.342316 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352399 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352854 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352869 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" 
containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352890 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352896 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init" Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352927 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352934 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353156 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353175 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353188 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353196 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init" Feb 26 20:14:23 crc 
kubenswrapper[4722]: I0226 20:14:23.353204 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.354441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.358820 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.359099 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.361346 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487848 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487905 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487953 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.488808 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.488874 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" 
containerID="cri-o://0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed" gracePeriod=600 Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.553569 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.560483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.560581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.561256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563075 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563153 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.564346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566245 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566332 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.567960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.569951 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.582368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.593105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.650286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.664994 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") 
pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665431 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665497 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665524 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86" (OuterVolumeSpecName: "kube-api-access-m6b86") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "kube-api-access-m6b86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.672128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts" (OuterVolumeSpecName: "scripts") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.679683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.703748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data" (OuterVolumeSpecName: "config-data") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.730175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770380 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770412 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770424 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770432 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770440 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770449 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6b86\" (UniqueName: 
\"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.907551 4722 generic.go:334] "Generic (PLEG): container finished" podID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerID="623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed" exitCode=0 Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.907760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerDied","Data":"623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"ff4cde58f299f02631a9d16ca39b9e73725979d99ee58934a90a37584ac923d7"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914657 4722 scope.go:117] "RemoveContainer" containerID="ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914729 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerDied","Data":"08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937453 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937509 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.961952 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed" exitCode=0 Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.962056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.962102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"} Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.010638 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.014254 4722 scope.go:117] "RemoveContainer" 
containerID="469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.028686 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044129 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: E0226 20:14:24.044586 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044605 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.045846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.050292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.050513 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.065866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.067195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.074918 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.075070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.108990 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.109356 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.110328 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.122554 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.125966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.164490 4722 scope.go:117] "RemoveContainer" containerID="28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.193847 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" path="/var/lib/kubelet/pods/1723b7a4-a96d-4144-b4cb-3e5735a38667/volumes" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.201482 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" path="/var/lib/kubelet/pods/fc86f06d-19f3-419d-bcf3-97376fb95f01/volumes" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.202237 4722 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.235044 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336863 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336923 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336971 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" 
(UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337036 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337171 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " 
pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.341409 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.341435 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.346257 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.346301 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.349413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.354518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.357366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.358865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod 
\"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.361655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.366235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.366754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.367335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.367473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc 
kubenswrapper[4722]: I0226 20:14:24.368924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.369913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.382851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.385970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.480484 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.485489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.510370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: E0226 20:14:24.539541 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f1a3d4_7c9d_4fb4_9d0c_4cbef841c7dd.slice/crio-08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f1a3d4_7c9d_4fb4_9d0c_4cbef841c7dd.slice\": RecentStats: unable to find data in memory cache]" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.710057 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.843068 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.872634 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.001699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002046 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002097 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vl4\" 
(UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"a81e036d-5879-4813-bfda-9a203246b1e3\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002673 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs" (OuterVolumeSpecName: "logs") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002917 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerDied","Data":"96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019701 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.030819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"e97085fd9ae89289f551beeee4068908739305a6ac14a94c20bf0771fae8222b"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.031074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh" (OuterVolumeSpecName: "kube-api-access-xm6gh") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "kube-api-access-xm6gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.036405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts" (OuterVolumeSpecName: "scripts") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.037125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4" (OuterVolumeSpecName: "kube-api-access-78vl4") pod "a81e036d-5879-4813-bfda-9a203246b1e3" (UID: "a81e036d-5879-4813-bfda-9a203246b1e3"). InnerVolumeSpecName "kube-api-access-78vl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.086361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.100407 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data" (OuterVolumeSpecName: "config-data") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.101632 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.104240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerDied","Data":"7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.104300 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106007 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106022 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106031 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106040 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106048 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.728814 
4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.807008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.810246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.936638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.952568 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb" (OuterVolumeSpecName: "kube-api-access-2tzkb") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "kube-api-access-2tzkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.006632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.011534 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029158 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029193 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029202 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc 
kubenswrapper[4722]: I0226 20:14:26.035164 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.273797 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" path="/var/lib/kubelet/pods/d8f7c080-b1b3-4173-8cad-c6d58715daf2/volumes" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.274743 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275048 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275066 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275102 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275108 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.283998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.284065 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.284078 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.285217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.285308 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.308858 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.310610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.325834 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dflrm" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326284 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326384 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326428 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326468 4722 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.332854 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.350990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerDied","Data":"24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.351521 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.351610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.364096 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.365693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"6753a3e2e289cf2a9e848d19931c5cf9300f728691e80555a5b2c7595e67c83c"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.365820 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.368018 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372435 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372460 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372579 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 
20:14:26.372622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.409827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db9cf967f-jqqzk" event={"ID":"783243ef-530a-418a-98b7-9f781077e95a","Type":"ContainerStarted","Data":"479ad88dbbe5b28f2c849f1afa3350f97df6f8e786791c234ae17ea47a2fcef2"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.420752 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.472365 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.472575 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" containerID="cri-o://794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" gracePeriod=10 Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.473989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474065 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474248 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7lt\" (UniqueName: \"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474504 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474782 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.476770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.477398 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.479628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 
20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.524453 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.529450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.529789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.534013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.559499 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576318 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7lt\" (UniqueName: \"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 
26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.580749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.580980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.583841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.584847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.598703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.600043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7lt\" (UniqueName: 
\"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.612802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.638968 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.677241 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.678730 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680596 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod 
\"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.681230 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.702331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.704802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.715179 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.750700 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod 
\"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod 
\"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.790683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " 
pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.815973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.819758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " 
pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.899753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.905282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.905822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 
20:14:26.907804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.932736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.065541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.077589 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.611535 4722 generic.go:334] "Generic (PLEG): container finished" podID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerID="794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" exitCode=0 Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.611749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.637880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerStarted","Data":"bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.657788 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m2kjh" podStartSLOduration=4.175871565 podStartE2EDuration="50.657772187s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:39.184013186 +0000 UTC m=+1161.720981110" lastFinishedPulling="2026-02-26 20:14:25.665913808 +0000 UTC m=+1208.202881732" observedRunningTime="2026-02-26 20:14:27.656876113 +0000 UTC m=+1210.193844037" watchObservedRunningTime="2026-02-26 20:14:27.657772187 +0000 UTC m=+1210.194740111" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.658205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.686331 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-7db9cf967f-jqqzk" event={"ID":"783243ef-530a-418a-98b7-9f781077e95a","Type":"ContainerStarted","Data":"629610c2a790a45fc41c19f1e5df6e64872fcee44be5c02ac5d7ea742b4ac0f1"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.686447 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.686427973 podStartE2EDuration="4.686427973s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:27.680452051 +0000 UTC m=+1210.217419975" watchObservedRunningTime="2026-02-26 20:14:27.686427973 +0000 UTC m=+1210.223395907" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.687405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.789600 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7db9cf967f-jqqzk" podStartSLOduration=4.789579387 podStartE2EDuration="4.789579387s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:27.719710935 +0000 UTC m=+1210.256678859" watchObservedRunningTime="2026-02-26 20:14:27.789579387 +0000 UTC m=+1210.326547311" Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.120621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.148186 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.174889 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.276278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.298641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.699319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.178565 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.181050 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.193449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.213458 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.242803 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 
20:14:29.321316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " 
pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " 
pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424036 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.425165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.430355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc 
kubenswrapper[4722]: I0226 20:14:29.433570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.444658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.455098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.472261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.488686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.563413 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:32 crc kubenswrapper[4722]: I0226 20:14:32.752740 4722 generic.go:334] "Generic (PLEG): container finished" podID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerID="bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454" exitCode=0 Feb 26 20:14:32 crc kubenswrapper[4722]: I0226 20:14:32.752844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerDied","Data":"bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454"} Feb 26 20:14:33 crc kubenswrapper[4722]: W0226 20:14:33.307644 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc01eeff5_0acc_4fd4_9097_9b3e8a888ccd.slice/crio-685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df WatchSource:0}: Error finding container 685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df: Status 404 returned error can't find the container with id 685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.680884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.681601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.715529 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.758179 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.793997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"d26fdada9a8387f71f00ec36fd91fabac4966d3286b6dfedd477c869c61fbc3a"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.816487 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"98297f22c4a3503843dbba7e292d41cc31258a2e5f1a6561f9463afce5ac877d"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.817405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.823753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.823991 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824080 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"e9f3cf3dd8ff0a42728cf74bba57287fbde3911334c60b0bdae4dd8003e33c2b"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824520 4722 scope.go:117] "RemoveContainer" containerID="794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824695 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.832256 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"c9990e01502993d45ebaff45cdf841da6914f82b8c7a3bcc5eaaf53c3ae7492d"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.836872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerStarted","Data":"ec1dce0600d42b9c5701750727602a3ceb0b54417521ccebed8d65c921f5194c"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852274 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.853298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf" (OuterVolumeSpecName: "kube-api-access-6gxxf") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "kube-api-access-6gxxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.896875 4722 scope.go:117] "RemoveContainer" containerID="e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.904642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.926035 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.563412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config" (OuterVolumeSpecName: "config") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.567753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.609684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.616680 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.617620 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647567 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647603 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647616 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647626 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 
20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647636 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.732520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750832 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751061 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751591 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.764408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.796552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb" (OuterVolumeSpecName: "kube-api-access-pd5tb") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "kube-api-access-pd5tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.804013 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts" (OuterVolumeSpecName: "scripts") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.813743 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.825911 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853257 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853542 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853552 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd5tb\" (UniqueName: 
\"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.871336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"d0cf552f53d064f3f1181852e6355946359dc4644e9b0c8858601a63abe3fca2"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.875855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"e6be2ee7780a9aca576fc8dbb4753b44e4a418e166bdac7c5d9f7e6f3ef80952"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.877604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.887492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895742 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895804 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerDied","Data":"fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895839 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.902005 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.901985384 podStartE2EDuration="11.901985384s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:34.90074714 +0000 UTC m=+1217.437715074" watchObservedRunningTime="2026-02-26 20:14:34.901985384 +0000 UTC m=+1217.438953308" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.906624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.916839 4722 generic.go:334] "Generic (PLEG): container finished" podID="38407a6b-b816-4be9-9005-403940ea34c9" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507" exitCode=0 Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.916976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" 
event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.944077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerStarted","Data":"6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.946085 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.958775 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.986934 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-9bqd7" podStartSLOduration=3.9913785969999998 podStartE2EDuration="57.986911723s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:39.531196729 +0000 UTC m=+1162.068164653" lastFinishedPulling="2026-02-26 20:14:33.526729865 +0000 UTC m=+1216.063697779" observedRunningTime="2026-02-26 20:14:34.968257179 +0000 UTC m=+1217.505225113" watchObservedRunningTime="2026-02-26 20:14:34.986911723 +0000 UTC m=+1217.523879647" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.149364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:35 crc 
kubenswrapper[4722]: E0226 20:14:35.149972 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.149989 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: E0226 20:14:35.150012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="init" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.150018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="init" Feb 26 20:14:35 crc kubenswrapper[4722]: E0226 20:14:35.150041 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.150239 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.167395 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.167462 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.168550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.168622 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.191711 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.264006 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.277731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data" (OuterVolumeSpecName: "config-data") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.290796 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.292603 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.307438 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.362958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: 
\"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363520 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.383224 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.389410 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.410523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.428866 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.464895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.464981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465095 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: 
\"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.466385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.469934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.471544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.473158 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.483682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.485757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.532782 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566734 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc 
kubenswrapper[4722]: I0226 20:14:35.566749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566768 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.567716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.568666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.598644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669513 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.674024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.677265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.679188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.682637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.685424 4722 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.702677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.743895 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.010532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"787c85ae09ffcf1cfea6c9cf7361491c4f8ad522a3cc3507606ad8a00c73addf"} Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be"} Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016895 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016906 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.017391 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.017420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.049667 4722 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podStartSLOduration=10.049650788 podStartE2EDuration="10.049650788s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:36.047597562 +0000 UTC m=+1218.584565496" watchObservedRunningTime="2026-02-26 20:14:36.049650788 +0000 UTC m=+1218.586618712" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.157712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" path="/var/lib/kubelet/pods/50752c02-e94a-4695-b201-5acd8e4fd7b9/volumes" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.455901 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.461576 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.598128 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.696354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: W0226 20:14:37.718902 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cc33bd_e962_4121_8a5d_0e75ba60fdf3.slice/crio-51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17 WatchSource:0}: Error finding container 51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17: Status 404 returned error can't find the container with id 
51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17 Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.733948 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.032189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.032284 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.089776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"8868e95e356607f9d2a8d04cc475cf73f0cfdd7c9d519402219acdb7a3b819a5"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.090385 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.093576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"cdacbced53405bef1ed7a025d33e1efdb5013e6b2f40c1f631b9b75a4ccadcad"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.093605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"94367d7ee45bbb330d18ee6782858df250a46f8a8345496459350091b2f9bb84"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.112260 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-695d67b888-54s74" podStartSLOduration=9.112239812 podStartE2EDuration="9.112239812s" podCreationTimestamp="2026-02-26 20:14:29 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.11034385 +0000 UTC m=+1220.647311774" watchObservedRunningTime="2026-02-26 20:14:38.112239812 +0000 UTC m=+1220.649207746" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.129577 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns" containerID="cri-o://d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" gracePeriod=10 Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.129809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerStarted","Data":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.130015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.134355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.134602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.204781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" 
event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"46d9babfb0f0f724270c82ceac725ea347c81f596da212ac3ae6ea0fa16bc700"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205092 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"3aac2365a2468889ade96c83ec75fcf98015fb0ef49093cd684c51ef45021eff"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.207101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"bb3f8c629cb14b51fda841bf1d20c86df04351527d2f5f68aff897d33d1e6339"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.207127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"6ff2756c69e9100f581333b4f44721697511210b398b4073c9766ad5aaf6d629"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.211731 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" podStartSLOduration=8.581457833 podStartE2EDuration="12.211717687s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="2026-02-26 20:14:33.312700679 +0000 UTC m=+1215.849668593" lastFinishedPulling="2026-02-26 20:14:36.942960533 +0000 UTC m=+1219.479928447" observedRunningTime="2026-02-26 20:14:38.159593645 +0000 UTC 
m=+1220.696561579" watchObservedRunningTime="2026-02-26 20:14:38.211717687 +0000 UTC m=+1220.748685611" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.219484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"7cb1f32eaa85ba614f78146f41f866dca1258348fa2a0d73dce20b9e35fed675"} Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.260925 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" podStartSLOduration=12.260908578 podStartE2EDuration="12.260908578s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.233567938 +0000 UTC m=+1220.770535862" watchObservedRunningTime="2026-02-26 20:14:38.260908578 +0000 UTC m=+1220.797876502" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.298613 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.330071 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-866c89845b-gpgsw" podStartSLOduration=12.330041601 podStartE2EDuration="12.330041601s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.316803932 +0000 UTC m=+1220.853771876" watchObservedRunningTime="2026-02-26 20:14:38.330041601 +0000 UTC m=+1220.867009525" Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.348945 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" podStartSLOduration=8.729970265 podStartE2EDuration="12.348927483s" 
podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="2026-02-26 20:14:33.31350512 +0000 UTC m=+1215.850473044" lastFinishedPulling="2026-02-26 20:14:36.932462338 +0000 UTC m=+1219.469430262" observedRunningTime="2026-02-26 20:14:38.333254558 +0000 UTC m=+1220.870222482" watchObservedRunningTime="2026-02-26 20:14:38.348927483 +0000 UTC m=+1220.885895407" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.092067 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186976 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.187025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 
crc kubenswrapper[4722]: I0226 20:14:39.187082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.187162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.192721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn" (OuterVolumeSpecName: "kube-api-access-7gsqn") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "kube-api-access-7gsqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265161 4722 generic.go:334] "Generic (PLEG): container finished" podID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerID="557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf" exitCode=0 Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf"} Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265273 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2"} Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.266194 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.268405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb"} Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.284015 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.288908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config" (OuterVolumeSpecName: "config") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289092 4722 generic.go:334] "Generic (PLEG): container finished" podID="38407a6b-b816-4be9-9005-403940ea34c9" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" exitCode=0 Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289446 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"} Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"ec1dce0600d42b9c5701750727602a3ceb0b54417521ccebed8d65c921f5194c"} Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289574 4722 scope.go:117] "RemoveContainer" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.290516 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.291103 4722 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.291799 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.292197 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.295072 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.320214 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" podStartSLOduration=4.320192989 podStartE2EDuration="4.320192989s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:39.304748381 +0000 UTC m=+1221.841716325" watchObservedRunningTime="2026-02-26 20:14:39.320192989 +0000 UTC m=+1221.857160913" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.343537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod 
"38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.344011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396805 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396835 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396846 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.459892 4722 scope.go:117] "RemoveContainer" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.530556 4722 scope.go:117] "RemoveContainer" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" Feb 26 20:14:39 crc kubenswrapper[4722]: E0226 20:14:39.535484 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": container with ID starting with d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3 not found: ID does not exist" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.535682 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"} err="failed to get container status \"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": rpc error: code = NotFound desc = could not find container \"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": container with ID starting with d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3 not found: ID does not exist" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.535757 4722 scope.go:117] "RemoveContainer" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507" Feb 26 20:14:39 crc kubenswrapper[4722]: E0226 20:14:39.536890 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": container with ID starting with 76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507 not found: ID does not exist" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.536915 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"} err="failed to get container status \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": rpc error: code = NotFound desc = could not find container \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": container with ID 
starting with 76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507 not found: ID does not exist" Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.639603 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.649167 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.159993 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38407a6b-b816-4be9-9005-403940ea34c9" path="/var/lib/kubelet/pods/38407a6b-b816-4be9-9005-403940ea34c9/volumes" Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.313755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510"} Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323242 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34"} Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323459 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log" containerID="cri-o://ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb" gracePeriod=30 Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323999 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" containerID="cri-o://7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" gracePeriod=30 Feb 26 20:14:40 crc 
kubenswrapper[4722]: I0226 20:14:40.324442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.356897 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.356874217 podStartE2EDuration="5.356874217s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:40.345425217 +0000 UTC m=+1222.882393151" watchObservedRunningTime="2026-02-26 20:14:40.356874217 +0000 UTC m=+1222.893842151" Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.877595 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.337356 4722 generic.go:334] "Generic (PLEG): container finished" podID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerID="6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed" exitCode=0 Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.337379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerDied","Data":"6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed"} Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.340659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607"} Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.343119 4722 generic.go:334] "Generic (PLEG): container finished" podID="2667d371-c443-4205-90cd-420ef3d0b62d" containerID="ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb" 
exitCode=143 Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.343205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb"} Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.386623 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.348022068 podStartE2EDuration="6.386595897s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="2026-02-26 20:14:37.736710441 +0000 UTC m=+1220.273678355" lastFinishedPulling="2026-02-26 20:14:38.77528426 +0000 UTC m=+1221.312252184" observedRunningTime="2026-02-26 20:14:41.378389654 +0000 UTC m=+1223.915357578" watchObservedRunningTime="2026-02-26 20:14:41.386595897 +0000 UTC m=+1223.923563831" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.581071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.842921 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.843180 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" containerID="cri-o://ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" gracePeriod=30 Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.843832 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" containerID="cri-o://fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" gracePeriod=30 Feb 26 20:14:41 crc 
kubenswrapper[4722]: I0226 20:14:41.877173 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"] Feb 26 20:14:41 crc kubenswrapper[4722]: E0226 20:14:41.877579 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877592 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns" Feb 26 20:14:41 crc kubenswrapper[4722]: E0226 20:14:41.877610 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="init" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="init" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877806 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.878950 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.889696 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": read tcp 10.217.0.2:59872->10.217.0.174:9696: read: connection reset by peer" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.895359 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"] Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950323 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.051908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.051975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052269 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod 
\"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.059500 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.060868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.061036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.061338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.062981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:42 crc 
kubenswrapper[4722]: I0226 20:14:42.063750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.074473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.239497 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.314967 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.398283 4722 generic.go:334] "Generic (PLEG): container finished" podID="724a51e1-b819-4615-8626-f2d5e69e6798" containerID="fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" exitCode=0
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.398606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e"}
Feb 26 20:14:43 crc kubenswrapper[4722]: I0226 20:14:43.631847 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": dial tcp 10.217.0.174:9696: connect: connection refused"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.137702 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b5cf9c6b-jmpww"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.182924 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b5cf9c6b-jmpww"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.516859 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.594889 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"]
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.711102 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.711182 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.789418 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.812508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.440177 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" containerID="cri-o://58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" gracePeriod=30
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.440704 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" containerID="cri-o://cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" gracePeriod=30
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.441014 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.441065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446424 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446457 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446430 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.535686 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.688312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.762589 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"]
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.762836 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" containerID="cri-o://8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" gracePeriod=10
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.808177 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.402215 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.166:5353: connect: connection refused"
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.465582 4722 generic.go:334] "Generic (PLEG): container finished" podID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerID="8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" exitCode=0
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.465682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500"}
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.468991 4722 generic.go:334] "Generic (PLEG): container finished" podID="724a51e1-b819-4615-8626-f2d5e69e6798" containerID="ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" exitCode=0
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.469047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777"}
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.479983 4722 generic.go:334] "Generic (PLEG): container finished" podID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerID="58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" exitCode=143
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.480040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d"}
Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.540170 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 26 20:14:47 crc kubenswrapper[4722]: I0226 20:14:47.489971 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" containerID="cri-o://e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" gracePeriod=30
Feb 26 20:14:47 crc kubenswrapper[4722]: I0226 20:14:47.490069 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" containerID="cri-o://1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" gracePeriod=30
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.108735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.246503 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.246639 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.411020 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516062 4722 generic.go:334] "Generic (PLEG): container finished" podID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerID="1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" exitCode=0
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516097 4722 generic.go:334] "Generic (PLEG): container finished" podID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerID="e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" exitCode=0
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607"}
Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516401 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510"}
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.372118 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.529744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerDied","Data":"b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba"}
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.529785 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba"
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.530807 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") "
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540771 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") "
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") "
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") "
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.541154 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") "
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.552654 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts" (OuterVolumeSpecName: "scripts") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.558076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs" (OuterVolumeSpecName: "certs") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.558501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q" (OuterVolumeSpecName: "kube-api-access-d4g2q") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "kube-api-access-d4g2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.583943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.584312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data" (OuterVolumeSpecName: "config-data") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643742 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643774 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643783 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643793 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.551489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"}
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552509 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" containerID="cri-o://e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" gracePeriod=30
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552604 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552847 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" containerID="cri-o://209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" gracePeriod=30
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552869 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core" containerID="cri-o://10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" gracePeriod=30
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552909 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent" containerID="cri-o://30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" gracePeriod=30
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.557486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"]
Feb 26 20:14:50 crc kubenswrapper[4722]: E0226 20:14:50.558185 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.558273 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.558543 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.563438 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568400 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568804 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568898 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.569408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.577871 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"]
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.600512 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.528993398 podStartE2EDuration="1m13.60049235s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.083588441 +0000 UTC m=+1162.620556355" lastFinishedPulling="2026-02-26 20:14:50.155087383 +0000 UTC m=+1232.692055307" observedRunningTime="2026-02-26 20:14:50.585967335 +0000 UTC m=+1233.122935299" watchObservedRunningTime="2026-02-26 20:14:50.60049235 +0000 UTC m=+1233.137460274"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791162 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.807403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.814775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.899815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.964383 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:45590->10.217.0.182:9311: read: connection reset by peer"
Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.964476 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:45594->10.217.0.182:9311: read: connection reset by peer"
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.000322 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.051086 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.056996 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259"
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.106877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.106930 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107147 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.141602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6" (OuterVolumeSpecName: "kube-api-access-w6mb6") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "kube-api-access-w6mb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.188267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"]
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.190636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.193274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208591 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208750 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208860 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208900 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") "
Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") "
Feb 26 20:14:51 crc kubenswrapper[4722]:
I0226 20:14:51.209013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209046 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209669 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209688 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.215075 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.217611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84" (OuterVolumeSpecName: "kube-api-access-4qt84") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "kube-api-access-4qt84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.225412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.230775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts" (OuterVolumeSpecName: "scripts") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.230878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.235072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g" (OuterVolumeSpecName: "kube-api-access-r8p7g") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "kube-api-access-r8p7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.250475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.256057 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.271076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config" (OuterVolumeSpecName: "config") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311525 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311560 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311570 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311581 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311590 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311598 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311610 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 
20:14:51.311619 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311627 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.313011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config" (OuterVolumeSpecName: "config") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.334388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.335459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.345733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.348423 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.390943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416557 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416590 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416599 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416609 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416619 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416628 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.421148 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data" (OuterVolumeSpecName: "config-data") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.518943 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.533669 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.570620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"321989e2d73a3267663d9620b3e60f2d9e5a9bac0112a52f3dd287ec6f466733"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578512 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"3aac2365a2468889ade96c83ec75fcf98015fb0ef49093cd684c51ef45021eff"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578887 4722 scope.go:117] "RemoveContainer" containerID="1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.585783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"2a0f4b08ed52374b7cc2281865a14714c885f9f0925762a095546b787fd0453f"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.585921 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.612725 4722 scope.go:117] "RemoveContainer" containerID="e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.613972 4722 generic.go:334] "Generic (PLEG): container finished" podID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerID="cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" exitCode=0 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.614036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.643423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"7967cef7aeddeb15162c1de8e5c92229cffd11f032094d514f5c0f541eb96ee7"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.643565 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.645908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerStarted","Data":"c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.651992 4722 scope.go:117] "RemoveContainer" containerID="fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.657967 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" exitCode=2 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.662943 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" exitCode=0 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.659629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.663014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.690532 4722 scope.go:117] "RemoveContainer" containerID="ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.693236 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.709033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.725998 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.741762 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747545 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: 
I0226 20:14:51.748123 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs" (OuterVolumeSpecName: "logs") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.748377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.749127 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.752935 4722 scope.go:117] "RemoveContainer" containerID="8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.754731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.758461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm" (OuterVolumeSpecName: "kube-api-access-k2jjm") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "kube-api-access-k2jjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.765256 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788284 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788844 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788874 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788886 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788894 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788917 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788932 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788939 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788973 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788982 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788995 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789005 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.789026 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="init" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789037 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="init" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.789047 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789057 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789310 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789332 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789354 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789372 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789384 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789408 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789422 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.791006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.793183 4722 scope.go:117] "RemoveContainer" containerID="a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.793611 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.800701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.829218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.836469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data" (OuterVolumeSpecName: "config-data") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.845534 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc 
kubenswrapper[4722]: I0226 20:14:51.852348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852461 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852472 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852481 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 
20:14:51.852489 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.863546 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954711 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.958463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.960019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.962738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.962874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.972692 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.129719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.159225 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" path="/var/lib/kubelet/pods/29b8dfbb-ff67-4a15-b078-0f7abe623431/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.159943 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" path="/var/lib/kubelet/pods/4de1f9bc-aa69-4351-a9c9-44f7b59deaea/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.160864 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" path="/var/lib/kubelet/pods/724a51e1-b819-4615-8626-f2d5e69e6798/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.673534 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:52 crc kubenswrapper[4722]: W0226 20:14:52.675794 4722 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116b7592_ce3d_44ff_94d9_2a16103f4058.slice/crio-3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382 WatchSource:0}: Error finding container 3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382: Status 404 returned error can't find the container with id 3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382 Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.676003 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" exitCode=0 Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.676078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.681700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerStarted","Data":"78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.689577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"92705017754fd6e9a04ada92e89447d98c7c5a806aa38b07178cebf756c7fb1c"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.689651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"7d41e018afa7038f64a036e9d071fb6b9d7e072e05693977b65c146a6f8c4695"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.690853 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"c9990e01502993d45ebaff45cdf841da6914f82b8c7a3bcc5eaaf53c3ae7492d"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711299 4722 scope.go:117] "RemoveContainer" containerID="cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711480 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.718000 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-f7nmr" podStartSLOduration=2.717984452 podStartE2EDuration="2.717984452s" podCreationTimestamp="2026-02-26 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:52.716533352 +0000 UTC m=+1235.253501276" watchObservedRunningTime="2026-02-26 20:14:52.717984452 +0000 UTC m=+1235.254952376" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.775476 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b6f7bc47c-7t9k4" podStartSLOduration=11.775459708 podStartE2EDuration="11.775459708s" podCreationTimestamp="2026-02-26 20:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:52.765557048 +0000 UTC m=+1235.302524972" watchObservedRunningTime="2026-02-26 20:14:52.775459708 +0000 UTC m=+1235.312427632" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.854378 4722 scope.go:117] 
"RemoveContainer" containerID="58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.854511 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.866796 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:53 crc kubenswrapper[4722]: I0226 20:14:53.724788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"3c6a0581dce96e8ec909c17265fb02b511cc0eb724891e41cf0f9a8ecbc0f132"} Feb 26 20:14:53 crc kubenswrapper[4722]: I0226 20:14:53.725170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.175905 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" path="/var/lib/kubelet/pods/c916c2e2-18cb-4b79-ae01-4c977da93866/volumes" Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.742441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"7d5f706b7235c9ab8735fddb8561aac7c06db2b6d0208016e6a7163acf1aef09"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.744197 4722 generic.go:334] "Generic (PLEG): container finished" podID="e702637a-959c-4660-b2a0-dc4325119819" containerID="78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d" exitCode=0 Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.744262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerDied","Data":"78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.767529 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.767514282 podStartE2EDuration="3.767514282s" podCreationTimestamp="2026-02-26 20:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:54.761130889 +0000 UTC m=+1237.298098813" watchObservedRunningTime="2026-02-26 20:14:54.767514282 +0000 UTC m=+1237.304482206" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.139090 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.214200 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342155 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342309 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.349291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs" (OuterVolumeSpecName: "certs") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.359574 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts" (OuterVolumeSpecName: "scripts") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.359665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms" (OuterVolumeSpecName: "kube-api-access-zlwms") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "kube-api-access-zlwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.374355 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data" (OuterVolumeSpecName: "config-data") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.377821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444299 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444328 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444340 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444350 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444364 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.731323 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerDied","Data":"c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60"} Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771478 4722 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771502 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.094779 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: E0226 20:14:57.095443 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.095459 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.095664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.097450 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.099841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.100078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.100289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.102441 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.103373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.125084 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.131000 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.210195 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.211992 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.231196 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.370888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.370939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") 
pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371208 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" 
(UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.410201 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.413046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.413410 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.414605 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.415736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.431351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.447158 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.448904 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.452742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.461078 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 
20:14:57.473582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.474478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.474490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.504700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.575785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.575853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.579052 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.680050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.684813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.685422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.685757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 
crc kubenswrapper[4722]: I0226 20:14:57.687237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.697650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.700010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.727784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.867805 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.136932 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:58 crc kubenswrapper[4722]: W0226 20:14:58.144080 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc52c422_c3c5_4b3d_81a3_57ee15cca146.slice/crio-37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16 WatchSource:0}: Error finding container 37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16: Status 404 returned error can't find the container with id 37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16 Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.363391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:58 crc kubenswrapper[4722]: W0226 20:14:58.367250 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff8cc6d_9c70_4b9b_ad9d_d8314b786523.slice/crio-362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d WatchSource:0}: Error finding container 362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d: Status 404 returned error can't find the container with id 362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.516494 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.799697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerStarted","Data":"362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807320 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerID="410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206" exitCode=0 Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerStarted","Data":"37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.813484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.813537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"576dfa8e71b7984aec29d6e84a2da09757dfcad700ac425d1ab6815470e9db32"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.837547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.839393 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.842471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerStarted","Data":"0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.842665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.872038 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.872017073 podStartE2EDuration="2.872017073s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:59.866905424 +0000 UTC m=+1242.403873368" watchObservedRunningTime="2026-02-26 20:14:59.872017073 +0000 UTC m=+1242.408984997" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.901823 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" podStartSLOduration=2.901805344 podStartE2EDuration="2.901805344s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:59.897483667 +0000 UTC m=+1242.434451611" watchObservedRunningTime="2026-02-26 20:14:59.901805344 +0000 UTC m=+1242.438773268" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.141595 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.143173 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.145663 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.152030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.162548 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.244637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.245038 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.245201 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.265289 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.352194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.357338 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.384085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.399534 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.400863 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404800 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gcfkm" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.414560 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.483417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.556545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658058 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658157 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.659560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.662450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.663109 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.690114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.778836 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.780517 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.816224 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.834327 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.837571 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.847200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.889357 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerStarted","Data":"886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc"} Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.910654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.3753611980000002 podStartE2EDuration="3.910632345s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="2026-02-26 20:14:58.371779661 +0000 UTC m=+1240.908747585" lastFinishedPulling="2026-02-26 20:14:59.907050798 +0000 UTC m=+1242.444018732" observedRunningTime="2026-02-26 20:15:00.91005428 +0000 UTC m=+1243.447022214" watchObservedRunningTime="2026-02-26 20:15:00.910632345 +0000 UTC m=+1243.447600279" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.947002 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: 
\"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77x5\" (UniqueName: \"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: E0226 20:15:01.009049 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 20:15:01 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e49d6d32-784a-444a-8866-fb6dc83878c5_0(30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8" Netns:"/var/run/netns/52a4d4ce-e5e9-44b3-a72c-45c3c431b4b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8;K8S_POD_UID=e49d6d32-784a-444a-8866-fb6dc83878c5" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e49d6d32-784a-444a-8866-fb6dc83878c5]: expected pod UID 
"e49d6d32-784a-444a-8866-fb6dc83878c5" but got "0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" from Kube API Feb 26 20:15:01 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 20:15:01 crc kubenswrapper[4722]: > Feb 26 20:15:01 crc kubenswrapper[4722]: E0226 20:15:01.009155 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 20:15:01 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e49d6d32-784a-444a-8866-fb6dc83878c5_0(30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8" Netns:"/var/run/netns/52a4d4ce-e5e9-44b3-a72c-45c3c431b4b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8;K8S_POD_UID=e49d6d32-784a-444a-8866-fb6dc83878c5" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e49d6d32-784a-444a-8866-fb6dc83878c5]: expected pod UID "e49d6d32-784a-444a-8866-fb6dc83878c5" but got "0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" from Kube API Feb 26 20:15:01 crc kubenswrapper[4722]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 20:15:01 crc kubenswrapper[4722]: > pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.049631 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:01 crc kubenswrapper[4722]: W0226 20:15:01.055654 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c234a00_8cb1_4bfb_906d_05e2d12f8222.slice/crio-c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe WatchSource:0}: Error finding container c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe: Status 404 returned error can't find the container with id c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.071861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.071941 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072344 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77x5\" (UniqueName: \"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.077770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.080655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.089261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77x5\" (UniqueName: 
\"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.168610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: W0226 20:15:01.753490 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0baf16e3_5ab0_4c5f_a6b7_b404fd878c7d.slice/crio-946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0 WatchSource:0}: Error finding container 946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0: Status 404 returned error can't find the container with id 946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.755105 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.901808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d","Type":"ContainerStarted","Data":"946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0"} Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903822 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerID="7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422" exitCode=0 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerDied","Data":"7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422"} Feb 26 20:15:01 crc kubenswrapper[4722]: 
I0226 20:15:01.903991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerStarted","Data":"c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe"} Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.904608 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" containerID="cri-o://636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" gracePeriod=30 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.904681 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" containerID="cri-o://c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" gracePeriod=30 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.918317 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.932858 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e49d6d32-784a-444a-8866-fb6dc83878c5" podUID="0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.989442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.991279 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.001285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.001567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.019921 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8" (OuterVolumeSpecName: "kube-api-access-p6sd8") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "kube-api-access-p6sd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092932 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092970 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092982 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.162029 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49d6d32-784a-444a-8866-fb6dc83878c5" 
path="/var/lib/kubelet/pods/e49d6d32-784a-444a-8866-fb6dc83878c5/volumes" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.591250 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.940916 4722 generic.go:334] "Generic (PLEG): container finished" podID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerID="c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" exitCode=0 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.940965 4722 generic.go:334] "Generic (PLEG): container finished" podID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerID="636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" exitCode=143 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941174 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3"} Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19"} Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941325 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" containerID="cri-o://886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" gracePeriod=30 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941637 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.067556 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e49d6d32-784a-444a-8866-fb6dc83878c5" podUID="0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.073097 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.126845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127594 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.141621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs" (OuterVolumeSpecName: "logs") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.163972 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp" (OuterVolumeSpecName: "kube-api-access-stkfp") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "kube-api-access-stkfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.164340 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts" (OuterVolumeSpecName: "scripts") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.164731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs" (OuterVolumeSpecName: "certs") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.174357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.198645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data" (OuterVolumeSpecName: "config-data") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.205379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236110 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236178 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236196 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236208 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236218 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236227 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236252 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.538876 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644970 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.645557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume" (OuterVolumeSpecName: "config-volume") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.651066 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.654331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd" (OuterVolumeSpecName: "kube-api-access-7g5vd") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). InnerVolumeSpecName "kube-api-access-7g5vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747240 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747288 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747299 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.961685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"576dfa8e71b7984aec29d6e84a2da09757dfcad700ac425d1ab6815470e9db32"} Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.962298 4722 scope.go:117] "RemoveContainer" containerID="c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.962510 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerDied","Data":"c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe"} Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980631 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980650 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.028740 4722 scope.go:117] "RemoveContainer" containerID="636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.066187 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.102002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125247 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125677 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125693 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125707 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125733 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125751 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125959 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125982 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.127070 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.132840 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.133012 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.133169 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.143539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165694 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.166016 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.166156 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.243315 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" path="/var/lib/kubelet/pods/092ac32d-651b-4cf2-af8e-a028eeea8006/volumes" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.268690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.268978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269085 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269305 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.273425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " 
pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.278003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.280618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.287473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.308822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.309439 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.481014 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:05 crc kubenswrapper[4722]: I0226 20:15:05.183798 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:05 crc kubenswrapper[4722]: E0226 20:15:05.925018 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff8cc6d_9c70_4b9b_ad9d_d8314b786523.slice/crio-886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.034885 4722 generic.go:334] "Generic (PLEG): container finished" podID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerID="886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" exitCode=0 Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.034948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerDied","Data":"886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"fc5320da3d9a270e99a8cf10b9849b44fb32d59bacc00c14b98e2cdd4eb56b17"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036813 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.055740 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.055722942 podStartE2EDuration="2.055722942s" podCreationTimestamp="2026-02-26 20:15:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:06.054964492 +0000 UTC m=+1248.591932416" watchObservedRunningTime="2026-02-26 20:15:06.055722942 +0000 UTC m=+1248.592690866" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.358274 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427521 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427798 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft" (OuterVolumeSpecName: "kube-api-access-fjjft") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "kube-api-access-fjjft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts" (OuterVolumeSpecName: "scripts") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443509 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs" (OuterVolumeSpecName: "certs") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.456350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.514080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.520001 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data" (OuterVolumeSpecName: "config-data") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530671 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530701 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530714 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530725 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530736 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530746 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.052328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.052608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerDied","Data":"362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d"} Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.053324 4722 scope.go:117] "RemoveContainer" containerID="886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.089816 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.099905 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: E0226 20:15:07.109404 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109423 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109622 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.110666 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.117561 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.123628 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.159028 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.160896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.279281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.288685 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.294917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc 
kubenswrapper[4722]: I0226 20:15:07.295893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.296679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.297128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.433906 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.580659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.716131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.716519 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" containerID="cri-o://cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" gracePeriod=10 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.055789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.077501 4722 generic.go:334] "Generic (PLEG): container finished" podID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerID="cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" exitCode=0 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.077555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2"} Feb 26 20:15:08 crc kubenswrapper[4722]: W0226 20:15:08.099727 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0705108_f020_43bc_a1af_7edae5a50927.slice/crio-d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74 WatchSource:0}: Error finding container d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74: Status 404 returned error can't find the container with id 
d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.179921 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" path="/var/lib/kubelet/pods/eff8cc6d-9c70-4b9b-ad9d-d8314b786523/volumes" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.837109 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.861347 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915168 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.929362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9" (OuterVolumeSpecName: "kube-api-access-tfjx9") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "kube-api-access-tfjx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.988547 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.991357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.991705 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.993759 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020700 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020733 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020743 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020754 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020762 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.054668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config" (OuterVolumeSpecName: "config") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.088624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerStarted","Data":"f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.088667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerStarted","Data":"d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092387 4722 scope.go:117] "RemoveContainer" containerID="cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092510 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.112883 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.112865121 podStartE2EDuration="2.112865121s" podCreationTimestamp="2026-02-26 20:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:09.110110085 +0000 UTC m=+1251.647078019" watchObservedRunningTime="2026-02-26 20:15:09.112865121 +0000 UTC m=+1251.649833045" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.122808 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.154700 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.156762 4722 scope.go:117] "RemoveContainer" containerID="557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.171806 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.765957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:09 crc kubenswrapper[4722]: E0226 20:15:09.766332 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.766351 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: E0226 
20:15:09.767463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="init" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.767483 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="init" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.767672 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.769107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771421 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771859 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.789179 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838403 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838438 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.943411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.943627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.967972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.970772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.975443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.975671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod 
\"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.979674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.986771 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.084892 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.180389 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" path="/var/lib/kubelet/pods/78cc33bd-e962-4121-8a5d-0e75ba60fdf3/volumes" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.744800 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.186:8776/healthcheck\": dial tcp 10.217.0.186:8776: connect: connection refused" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.909026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.142666 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"6f909f68458c9d25d0ac67092bc10b947065f81f6b658f3e68b23b373f39ba9c"} Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.176430 4722 generic.go:334] "Generic (PLEG): container finished" podID="2667d371-c443-4205-90cd-420ef3d0b62d" containerID="7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" exitCode=137 Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.176470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34"} Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.637270 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701605 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.702132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.702183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.705248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.706525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs" (OuterVolumeSpecName: "logs") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.712049 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts" (OuterVolumeSpecName: "scripts") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.717284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.721766 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8" (OuterVolumeSpecName: "kube-api-access-78hs8") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "kube-api-access-78hs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.760274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.796252 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data" (OuterVolumeSpecName: "config-data") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804339 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804364 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804373 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804384 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804392 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804401 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804409 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.197995 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"f4be6124b1b719cbd4cf8e5f2f853baf6e4c476d26bdd072a68cae4581ce00cc"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"94d983920761b3d30fe3e32f4a9c8fb362a10bf921345e74d15b9963b2c5543f"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198331 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"7cb1f32eaa85ba614f78146f41f866dca1258348fa2a0d73dce20b9e35fed675"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219454 4722 scope.go:117] "RemoveContainer" containerID="7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219571 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.223520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.240346 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b495fbf79-442st" podStartSLOduration=3.240324696 podStartE2EDuration="3.240324696s" podCreationTimestamp="2026-02-26 20:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:12.238389083 +0000 UTC m=+1254.775357007" watchObservedRunningTime="2026-02-26 20:15:12.240324696 +0000 UTC m=+1254.777292640" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.294705 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.323241 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348214 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348493 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cfb9b54-qvhbm" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api" containerID="cri-o://3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3" gracePeriod=30 Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348932 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cfb9b54-qvhbm" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" containerID="cri-o://bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995" gracePeriod=30 Feb 26 20:15:12 crc 
kubenswrapper[4722]: I0226 20:15:12.364500 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: E0226 20:15:12.365003 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365024 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log" Feb 26 20:15:12 crc kubenswrapper[4722]: E0226 20:15:12.365064 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365070 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365294 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365323 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.366413 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.374091 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.374325 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.375495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.385294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433446 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433486 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" 
(UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535719 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.536345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc 
kubenswrapper[4722]: I0226 20:15:12.541912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.543382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.543727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.546788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.547636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.548746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.554849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.700548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:13 crc kubenswrapper[4722]: I0226 20:15:13.269941 4722 generic.go:334] "Generic (PLEG): container finished" podID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerID="bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995" exitCode=0 Feb 26 20:15:13 crc kubenswrapper[4722]: I0226 20:15:13.271075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995"} Feb 26 20:15:14 crc kubenswrapper[4722]: I0226 20:15:14.158244 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" path="/var/lib/kubelet/pods/2667d371-c443-4205-90cd-420ef3d0b62d/volumes" Feb 26 20:15:18 crc kubenswrapper[4722]: I0226 20:15:18.328633 4722 generic.go:334] "Generic (PLEG): container finished" podID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerID="3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3" exitCode=0 Feb 26 20:15:18 crc kubenswrapper[4722]: I0226 20:15:18.328741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3"} Feb 26 20:15:18 
crc kubenswrapper[4722]: I0226 20:15:18.743758 4722 scope.go:117] "RemoveContainer" containerID="ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.251759 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308774 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.309026 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.309073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.314610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.318543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn" (OuterVolumeSpecName: "kube-api-access-n56pn") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "kube-api-access-n56pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.341616 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"13fdf3fbbd44bcdef851fc4937da95414f9511f1d58caad15216959bbf0ce9d4"} Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.341710 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.342345 4722 scope.go:117] "RemoveContainer" containerID="bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.344287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d","Type":"ContainerStarted","Data":"e0293e82d5fbb156e242ab098696c2279affdaa6da4a1deb98601e0a77f48f2b"} Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.372599 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.34671563 podStartE2EDuration="19.372579294s" podCreationTimestamp="2026-02-26 20:15:00 +0000 UTC" firstStartedPulling="2026-02-26 20:15:01.755774546 +0000 UTC m=+1244.292742470" lastFinishedPulling="2026-02-26 20:15:18.78163821 +0000 UTC m=+1261.318606134" observedRunningTime="2026-02-26 20:15:19.36107792 +0000 UTC m=+1261.898045874" watchObservedRunningTime="2026-02-26 20:15:19.372579294 +0000 UTC m=+1261.909547228" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.390399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config" (OuterVolumeSpecName: "config") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.395760 4722 scope.go:117] "RemoveContainer" containerID="3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.398309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.404187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.410986 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411018 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411027 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411035 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.428298 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.512657 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.721075 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.733425 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.107200 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.107608 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.160937 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" path="/var/lib/kubelet/pods/7810fb24-84d9-45c8-9456-7d1a6c6c8fff/volumes" Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.367986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"6f6c934c8dac488639dfc4aef782df518697efa1e20c828386a67a4ff1c2d76b"} Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.368496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"55e904e50f9f93a426ad9fbc7a389521998a647a5b6b23e396bf7296bc411d4c"} Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.116176 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248879 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248906 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249055 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod 
\"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249092 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.250051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.250117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.268291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts" (OuterVolumeSpecName: "scripts") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.277288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp" (OuterVolumeSpecName: "kube-api-access-kwwzp") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "kube-api-access-kwwzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.293335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.331461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351443 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351479 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351490 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351500 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351511 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351521 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.381822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"3269d50604b73c7eb0380879280145fb85b313dd3f750def92a1816536f46b13"} Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 
20:15:21.382832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385159 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" exitCode=137 Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385260 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"} Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"db737bb35890c1c6ada44a53fbe5b35f5ec6b4917823fc3fd7aa46e8919c0258"} Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385446 4722 scope.go:117] "RemoveContainer" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385499 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.388734 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data" (OuterVolumeSpecName: "config-data") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.406379 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.406358885 podStartE2EDuration="9.406358885s" podCreationTimestamp="2026-02-26 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:21.400058282 +0000 UTC m=+1263.937026206" watchObservedRunningTime="2026-02-26 20:15:21.406358885 +0000 UTC m=+1263.943326819" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.410424 4722 scope.go:117] "RemoveContainer" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.431621 4722 scope.go:117] "RemoveContainer" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.450701 4722 scope.go:117] "RemoveContainer" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.454519 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472108 4722 scope.go:117] "RemoveContainer" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.472626 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": container with ID starting with 209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852 not found: ID does not exist" 
containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472680 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"} err="failed to get container status \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": rpc error: code = NotFound desc = could not find container \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": container with ID starting with 209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852 not found: ID does not exist" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472710 4722 scope.go:117] "RemoveContainer" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473160 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": container with ID starting with 10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315 not found: ID does not exist" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473309 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} err="failed to get container status \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": rpc error: code = NotFound desc = could not find container \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": container with ID starting with 10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315 not found: ID does not exist" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473391 4722 scope.go:117] 
"RemoveContainer" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473713 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": container with ID starting with 30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0 not found: ID does not exist" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473744 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} err="failed to get container status \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": rpc error: code = NotFound desc = could not find container \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": container with ID starting with 30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0 not found: ID does not exist" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473762 4722 scope.go:117] "RemoveContainer" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473995 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": container with ID starting with e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d not found: ID does not exist" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.474026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} err="failed to get container status \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": rpc error: code = NotFound desc = could not find container \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": container with ID starting with e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d not found: ID does not exist" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.750323 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.760675 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777072 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777497 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777514 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777541 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777547 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777557 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 
20:15:21.777564 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777574 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777580 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777596 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core" Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777613 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777810 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777819 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777833 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777842 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777862 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.779571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.782198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.785806 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.795618 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861569 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861753 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.862109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964519 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: 
I0226 20:15:21.964645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.965040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.965270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.973837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") 
" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.001940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.149732 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.159688 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" path="/var/lib/kubelet/pods/834d875f-efb0-42d3-8aad-fd7a7209cbeb/volumes" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.194454 4722 scope.go:117] "RemoveContainer" containerID="8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.368836 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.370606 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.416214 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.474712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.475301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.496275 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.497917 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.518370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577199 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.578546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.578599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.579223 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.581263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.586886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.593734 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.594119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.595250 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.602382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.618191 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: 
\"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.683572 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.700726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.707654 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.741211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:22 crc kubenswrapper[4722]: W0226 20:15:22.767940 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e123c48_da1a_45ec_900b_d09057a529d7.slice/crio-963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3 WatchSource:0}: Error finding container 963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3: Status 404 returned error can't find the container with id 963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3 Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.773529 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.775266 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.777461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784151 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.785278 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.786195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.796574 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.806863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.815682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.826322 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.887057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.887190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.912851 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.935239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.986890 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.988327 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.990398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.990567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.991561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.003179 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.008868 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.009993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod 
\"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.092838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.093056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.194418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.194520 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.196072 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.209321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.216124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.319321 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.329146 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.453866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerStarted","Data":"0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b"} Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.455088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3"} Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.485645 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.614263 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:23 crc kubenswrapper[4722]: W0226 20:15:23.619958 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ecfe90_9cf6_4ec4_aaa6_295d71d4daac.slice/crio-3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484 WatchSource:0}: Error finding container 3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484: Status 404 returned error can't find the container with id 3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484 Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.628648 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.835279 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:24 crc kubenswrapper[4722]: W0226 20:15:24.050609 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf30249f_96fd_4efc_a9f1_9d571dc0e104.slice/crio-e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba WatchSource:0}: Error finding container e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba: Status 404 returned error can't find the container with id e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.057644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466480 4722 generic.go:334] "Generic (PLEG): container finished" podID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerID="0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a" exitCode=0 Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerDied","Data":"0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerStarted","Data":"4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.469451 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerStarted","Data":"623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee"} Feb 26 
20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.469492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerStarted","Data":"a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.472290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerStarted","Data":"49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.472317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerStarted","Data":"e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.477268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.477304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.478779 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerID="c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803" exitCode=0 Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.478819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" 
event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerDied","Data":"c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.480194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerStarted","Data":"30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.480219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerStarted","Data":"3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.486032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerStarted","Data":"d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.486084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerStarted","Data":"b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.509891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-ndnrb" podStartSLOduration=2.509872177 podStartE2EDuration="2.509872177s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.499455483 +0000 UTC m=+1267.036423407" watchObservedRunningTime="2026-02-26 
20:15:24.509872177 +0000 UTC m=+1267.046840111" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.538039 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" podStartSLOduration=2.5380150439999998 podStartE2EDuration="2.538015044s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.527488547 +0000 UTC m=+1267.064456471" watchObservedRunningTime="2026-02-26 20:15:24.538015044 +0000 UTC m=+1267.074982978" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.566635 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-051f-account-create-update-5jdk4" podStartSLOduration=2.566618993 podStartE2EDuration="2.566618993s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.558469281 +0000 UTC m=+1267.095437205" watchObservedRunningTime="2026-02-26 20:15:24.566618993 +0000 UTC m=+1267.103586917" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.622722 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" podStartSLOduration=2.622699472 podStartE2EDuration="2.622699472s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.616169953 +0000 UTC m=+1267.153137887" watchObservedRunningTime="2026-02-26 20:15:24.622699472 +0000 UTC m=+1267.159667396" Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.496237 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac8f5041-719a-463a-be2b-58da5280e1b9" 
containerID="623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.496355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerDied","Data":"623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.498813 4722 generic.go:334] "Generic (PLEG): container finished" podID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerID="49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.498854 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerDied","Data":"49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.500946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.502542 4722 generic.go:334] "Generic (PLEG): container finished" podID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerID="30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.502615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerDied","Data":"30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.504076 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerID="d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.504341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerDied","Data":"d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.112511 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.119104 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168894 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.171584 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37b676a2-eba1-45dd-accd-84f2c1d0eba6" (UID: "37b676a2-eba1-45dd-accd-84f2c1d0eba6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.172374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe5cc671-e3c0-4b89-a2db-be576bf17d80" (UID: "fe5cc671-e3c0-4b89-a2db-be576bf17d80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.174893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs" (OuterVolumeSpecName: "kube-api-access-hhmxs") pod "fe5cc671-e3c0-4b89-a2db-be576bf17d80" (UID: "fe5cc671-e3c0-4b89-a2db-be576bf17d80"). InnerVolumeSpecName "kube-api-access-hhmxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.177007 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh" (OuterVolumeSpecName: "kube-api-access-hm2bh") pod "37b676a2-eba1-45dd-accd-84f2c1d0eba6" (UID: "37b676a2-eba1-45dd-accd-84f2c1d0eba6"). InnerVolumeSpecName "kube-api-access-hm2bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.271959 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.271998 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.272012 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.272026 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: E0226 20:15:26.525650 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": 
RecentStats: unable to find data in memory cache]" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerDied","Data":"0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544304 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544408 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546644 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerDied","Data":"4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546822 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.893019 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.996005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"9ef0d022-c81c-489e-91aa-209be0812ce0\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.996205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"9ef0d022-c81c-489e-91aa-209be0812ce0\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.997519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ef0d022-c81c-489e-91aa-209be0812ce0" (UID: "9ef0d022-c81c-489e-91aa-209be0812ce0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.006128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f" (OuterVolumeSpecName: "kube-api-access-6tg5f") pod "9ef0d022-c81c-489e-91aa-209be0812ce0" (UID: "9ef0d022-c81c-489e-91aa-209be0812ce0"). InnerVolumeSpecName "kube-api-access-6tg5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.098905 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.098937 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.299195 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.305019 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.310210 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.402610 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.402868 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" containerID="cri-o://e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" gracePeriod=30 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403154 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" containerID="cri-o://7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" gracePeriod=30 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"ac8f5041-719a-463a-be2b-58da5280e1b9\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"af30249f-96fd-4efc-a9f1-9d571dc0e104\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403475 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod 
\"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403585 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"af30249f-96fd-4efc-a9f1-9d571dc0e104\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"ac8f5041-719a-463a-be2b-58da5280e1b9\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.404892 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac8f5041-719a-463a-be2b-58da5280e1b9" (UID: "ac8f5041-719a-463a-be2b-58da5280e1b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.405527 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" (UID: "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.405813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af30249f-96fd-4efc-a9f1-9d571dc0e104" (UID: "af30249f-96fd-4efc-a9f1-9d571dc0e104"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.436348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj" (OuterVolumeSpecName: "kube-api-access-d2xxj") pod "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" (UID: "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac"). InnerVolumeSpecName "kube-api-access-d2xxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.438870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg" (OuterVolumeSpecName: "kube-api-access-rflpg") pod "af30249f-96fd-4efc-a9f1-9d571dc0e104" (UID: "af30249f-96fd-4efc-a9f1-9d571dc0e104"). InnerVolumeSpecName "kube-api-access-rflpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.451643 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9" (OuterVolumeSpecName: "kube-api-access-bpjh9") pod "ac8f5041-719a-463a-be2b-58da5280e1b9" (UID: "ac8f5041-719a-463a-be2b-58da5280e1b9"). InnerVolumeSpecName "kube-api-access-bpjh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505637 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505676 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505687 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505695 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505704 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505712 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.529567 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556086 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerDied","Data":"a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556597 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.559036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.559211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerDied","Data":"3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560666 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560670 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562375 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerDied","Data":"b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562413 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.564130 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" exitCode=143 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.564200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerDied","Data":"e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565598 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565612 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.590967 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.413426178 podStartE2EDuration="6.590952026s" podCreationTimestamp="2026-02-26 20:15:21 +0000 UTC" firstStartedPulling="2026-02-26 20:15:22.775076523 +0000 UTC m=+1265.312044447" lastFinishedPulling="2026-02-26 20:15:26.952602371 +0000 UTC m=+1269.489570295" observedRunningTime="2026-02-26 20:15:27.584571623 +0000 UTC m=+1270.121539557" watchObservedRunningTime="2026-02-26 20:15:27.590952026 +0000 UTC m=+1270.127919950" Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573394 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" containerID="cri-o://1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573440 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" containerID="cri-o://a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573439 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" containerID="cri-o://0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573482 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" 
containerID="cri-o://8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.003202 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.004019 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd" containerID="cri-o://7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.027571 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log" containerID="cri-o://766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.586960 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f846463-6d0b-474c-bb69-05430903325e" containerID="766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569" exitCode=143 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.587050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590491 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" exitCode=0 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590522 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" 
containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" exitCode=2 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590535 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" exitCode=0 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.105278 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.281332 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376429 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376605 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376936 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.377510 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.377542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.384069 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts" (OuterVolumeSpecName: "scripts") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.384817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj" (OuterVolumeSpecName: "kube-api-access-wvzlj") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "kube-api-access-wvzlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.410279 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.477539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478765 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478783 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478791 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478800 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478809 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.504324 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data" (OuterVolumeSpecName: "config-data") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.580466 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601620 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" exitCode=0 Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601705 4722 scope.go:117] "RemoveContainer" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601705 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.650520 4722 scope.go:117] "RemoveContainer" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.655361 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.669146 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.675318 4722 scope.go:117] "RemoveContainer" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.687726 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687794 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.687873 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687932 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688062 4722 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688121 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688220 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688291 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688343 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688402 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688453 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688537 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688596 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688659 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc 
kubenswrapper[4722]: I0226 20:15:30.688712 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688831 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688947 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689204 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689272 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689330 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689398 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689455 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689506 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689557 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689608 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689667 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689730 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.691648 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.694323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.694323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.700729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.732295 4722 scope.go:117] "RemoveContainer" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.764529 4722 scope.go:117] "RemoveContainer" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.766292 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": container with ID starting with 8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56 not found: ID does not exist" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.766325 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} err="failed to get container status \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": rpc error: code = NotFound desc = could not find container \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": container with ID starting with 8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56 not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 
20:15:30.766345 4722 scope.go:117] "RemoveContainer" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.769546 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": container with ID starting with 0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df not found: ID does not exist" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.769571 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} err="failed to get container status \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": rpc error: code = NotFound desc = could not find container \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": container with ID starting with 0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.769592 4722 scope.go:117] "RemoveContainer" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.773236 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": container with ID starting with a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e not found: ID does not exist" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.773271 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} err="failed to get container status \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": rpc error: code = NotFound desc = could not find container \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": container with ID starting with a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.773294 4722 scope.go:117] "RemoveContainer" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.777373 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": container with ID starting with 1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e not found: ID does not exist" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.777439 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} err="failed to get container status \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": rpc error: code = NotFound desc = could not find container \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": container with ID starting with 1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: 
\"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786421 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786522 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 
20:15:30.786702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888446 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod 
\"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.889320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.889539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.896781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.897707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.898716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.909042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.937090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.009701 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.404837 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.517502 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.517918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.520463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs" (OuterVolumeSpecName: "logs") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.525459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl" (OuterVolumeSpecName: "kube-api-access-wfgtl") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "kube-api-access-wfgtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.526479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts" (OuterVolumeSpecName: "scripts") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.526824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.569709 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (OuterVolumeSpecName: "glance") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "pvc-c3598451-3b65-4991-9779-75a64db7d9c0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.574071 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.609211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620834 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620874 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620889 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620900 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620913 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620926 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620956 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632556 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632933 4722 scope.go:117] "RemoveContainer" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.634824 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" exitCode=0 Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.634890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"6753a3e2e289cf2a9e848d19931c5cf9300f728691e80555a5b2c7595e67c83c"} Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.640459 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.644254 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data" (OuterVolumeSpecName: "config-data") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.680602 4722 scope.go:117] "RemoveContainer" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.688853 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.689198 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0") on node "crc" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.707371 4722 scope.go:117] "RemoveContainer" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.711563 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": container with ID starting with 7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e not found: ID does not exist" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.711673 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} err="failed to get container status \"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": rpc error: code = NotFound desc = could not find container 
\"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": container with ID starting with 7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e not found: ID does not exist" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.711755 4722 scope.go:117] "RemoveContainer" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.715422 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": container with ID starting with e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf not found: ID does not exist" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.715485 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} err="failed to get container status \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": rpc error: code = NotFound desc = could not find container \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": container with ID starting with e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf not found: ID does not exist" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.722948 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.723083 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" DevicePath 
\"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.965697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.979558 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.996942 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.998238 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998260 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.998268 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998274 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998481 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998503 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.999626 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.002686 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.002855 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.012659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.129827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130327 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.160819 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" path="/var/lib/kubelet/pods/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462/volumes" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.162570 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" path="/var/lib/kubelet/pods/6e123c48-da1a-45ec-900b-d09057a529d7/volumes" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232423 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc 
kubenswrapper[4722]: I0226 20:15:32.232567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.234562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.236817 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240275 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240996 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.241022 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.241605 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.245752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.268822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.317642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.382433 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.648777 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f846463-6d0b-474c-bb69-05430903325e" containerID="7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2" exitCode=0 Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.648846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2"} Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.650865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a"} Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.106504 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.668077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873"} Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.756996 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"] Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.760114 4722 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765261 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-crxb6" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765366 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765471 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.786430 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"] Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " 
pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967717 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: 
\"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.979809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.983402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.987126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.004893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.130546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.268593 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.420505 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496379 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496430 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc 
kubenswrapper[4722]: I0226 20:15:34.496804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.508796 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs" (OuterVolumeSpecName: "logs") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.508822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.549380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts" (OuterVolumeSpecName: "scripts") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.555760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn" (OuterVolumeSpecName: "kube-api-access-xhppn") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "kube-api-access-xhppn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600586 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600840 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600917 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600979 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc 
kubenswrapper[4722]: I0226 20:15:34.636901 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (OuterVolumeSpecName: "glance") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "pvc-b7104307-bea6-42a8-bb91-b3367a15255d". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.658498 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data" (OuterVolumeSpecName: "config-data") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.659261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.713671 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.715230 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" " Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.715332 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.717772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"e97085fd9ae89289f551beeee4068908739305a6ac14a94c20bf0771fae8222b"} Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.717845 4722 scope.go:117] "RemoveContainer" containerID="7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.718001 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.726025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"ab0684c9ed9b9f4b11f8da5714df0354c8ad5b1f2b9d198c6c8347b5cf65d169"} Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.731846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.817351 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.850232 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.850418 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d") on node "crc" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.892614 4722 scope.go:117] "RemoveContainer" containerID="766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569" Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.913131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"] Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.918803 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.075188 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.100974 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112207 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:35 crc kubenswrapper[4722]: E0226 20:15:35.112639 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112657 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd" Feb 26 20:15:35 crc kubenswrapper[4722]: E0226 20:15:35.112683 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112690 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112880 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112902 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.113997 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.116900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.122095 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.123353 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.232646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336611 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.337835 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.339624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.343753 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.343795 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.344689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.346342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.349617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.352411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.364711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.417705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.443892 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.743058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"554f22fdef9e5be2104df5677d75e0af75c90187db7044af985a859b5118d877"} Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.745841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerStarted","Data":"3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d"} Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.750413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090"} Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.110264 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.201987 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f846463-6d0b-474c-bb69-05430903325e" path="/var/lib/kubelet/pods/6f846463-6d0b-474c-bb69-05430903325e/volumes" Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.766090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"1dffecc2c1e15ea86e46c8549553553cf5a5af81db7b6b0474b5a5925c8dcfe0"} Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.775204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc"} Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.801332 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"2c0803a4079590f0706e107e7dbfe058a23dbceed0d20caed2f31101e7778fe9"} Feb 26 20:15:36 crc kubenswrapper[4722]: E0226 20:15:36.860452 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.862070 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.862049899 podStartE2EDuration="5.862049899s" podCreationTimestamp="2026-02-26 20:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:36.851599894 +0000 UTC m=+1279.388567818" watchObservedRunningTime="2026-02-26 20:15:36.862049899 +0000 UTC m=+1279.399017823" Feb 26 20:15:37 crc kubenswrapper[4722]: I0226 20:15:37.819623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"44167eefa0be2513bba8ced26f5f9956fa64d0b1fe73d872d7790d7f080fd4ef"} Feb 26 20:15:37 crc kubenswrapper[4722]: I0226 20:15:37.820258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"26a573cccf7105f1749bbe7ab88abe5e01c8fa90677197e3751476e5b99cf4c2"} Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.167727 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.167703179 podStartE2EDuration="3.167703179s" podCreationTimestamp="2026-02-26 20:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:37.844177522 +0000 UTC m=+1280.381145456" watchObservedRunningTime="2026-02-26 20:15:38.167703179 +0000 UTC m=+1280.704671123" Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446"} Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831670 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" containerID="cri-o://999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" gracePeriod=30 Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831709 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" containerID="cri-o://de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" gracePeriod=30 Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831718 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" 
containerID="cri-o://9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" gracePeriod=30 Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831757 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" containerID="cri-o://ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" gracePeriod=30 Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.854042 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557704223 podStartE2EDuration="8.854021831s" podCreationTimestamp="2026-02-26 20:15:30 +0000 UTC" firstStartedPulling="2026-02-26 20:15:31.661317216 +0000 UTC m=+1274.198285140" lastFinishedPulling="2026-02-26 20:15:37.957634824 +0000 UTC m=+1280.494602748" observedRunningTime="2026-02-26 20:15:38.849497987 +0000 UTC m=+1281.386465921" watchObservedRunningTime="2026-02-26 20:15:38.854021831 +0000 UTC m=+1281.390989755" Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875372 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" exitCode=0 Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875725 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" exitCode=2 Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875740 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" exitCode=0 Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446"} Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875796 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc"} Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090"} Feb 26 20:15:41 crc kubenswrapper[4722]: I0226 20:15:41.739675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.383614 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.385103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.434309 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.547812 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.905744 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.905787 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Feb 26 20:15:43 crc kubenswrapper[4722]: I0226 20:15:43.916630 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" exitCode=0 Feb 26 20:15:43 crc kubenswrapper[4722]: I0226 20:15:43.916723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873"} Feb 26 20:15:44 crc kubenswrapper[4722]: I0226 20:15:44.781157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:15:44 crc kubenswrapper[4722]: I0226 20:15:44.786687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.444562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.444929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.488910 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.489363 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.988806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.989320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.295951 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: 
I0226 20:15:46.422965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423657 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423675 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.430436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts" (OuterVolumeSpecName: "scripts") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.436798 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk" (OuterVolumeSpecName: "kube-api-access-2bsbk") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "kube-api-access-2bsbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.476261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526500 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526530 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526540 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.555685 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data" (OuterVolumeSpecName: "config-data") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.567018 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.629056 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.629115 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a"} Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000725 4722 scope.go:117] "RemoveContainer" containerID="9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000882 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.006412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerStarted","Data":"741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07"} Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.039880 4722 scope.go:117] "RemoveContainer" containerID="de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.054337 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" podStartSLOduration=3.16962958 podStartE2EDuration="14.054313663s" podCreationTimestamp="2026-02-26 20:15:33 +0000 UTC" firstStartedPulling="2026-02-26 20:15:34.912300007 +0000 UTC m=+1277.449267931" lastFinishedPulling="2026-02-26 20:15:45.79698409 +0000 UTC m=+1288.333952014" observedRunningTime="2026-02-26 20:15:47.027510363 +0000 UTC m=+1289.564478297" watchObservedRunningTime="2026-02-26 20:15:47.054313663 +0000 UTC m=+1289.591281607" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.070276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.088305 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.096269 4722 scope.go:117] "RemoveContainer" containerID="ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.105866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112555 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" 
containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112587 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112606 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112612 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112622 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112656 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112662 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112923 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112959 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112969 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.122342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.129736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.142603 4722 scope.go:117] "RemoveContainer" containerID="999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.143072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.143072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.155660 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66db1f3d_ad31_4c73_bdab_134c962316c3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66db1f3d_ad31_4c73_bdab_134c962316c3.slice/crio-3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249886 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250198 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.351902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc 
kubenswrapper[4722]: I0226 20:15:47.353015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.353473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.353593 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.357654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.357938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.366163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.370118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.370197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.450651 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.964745 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.975529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.979676 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:48 crc kubenswrapper[4722]: I0226 20:15:48.015441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac"} Feb 26 20:15:48 crc kubenswrapper[4722]: I0226 20:15:48.157235 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" path="/var/lib/kubelet/pods/66db1f3d-ad31-4c73-bdab-134c962316c3/volumes" Feb 26 20:15:49 crc kubenswrapper[4722]: I0226 20:15:49.072844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f"} Feb 26 20:15:50 crc kubenswrapper[4722]: I0226 20:15:50.087700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606"} Feb 26 20:15:51 crc kubenswrapper[4722]: I0226 20:15:51.918276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:52 crc kubenswrapper[4722]: I0226 20:15:52.109220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118"} Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.147539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0"} Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148338 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" containerID="cri-o://ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148960 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" containerID="cri-o://85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.149016 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" containerID="cri-o://29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.149066 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" 
containerID="cri-o://5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.181191 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.007831607 podStartE2EDuration="8.181171654s" podCreationTimestamp="2026-02-26 20:15:47 +0000 UTC" firstStartedPulling="2026-02-26 20:15:47.961688849 +0000 UTC m=+1290.498656773" lastFinishedPulling="2026-02-26 20:15:54.135028896 +0000 UTC m=+1296.671996820" observedRunningTime="2026-02-26 20:15:55.169809224 +0000 UTC m=+1297.706777158" watchObservedRunningTime="2026-02-26 20:15:55.181171654 +0000 UTC m=+1297.718139578" Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163543 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0" exitCode=0 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163864 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118" exitCode=2 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163875 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606" exitCode=0 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0"} Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118"} Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163932 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606"} Feb 26 20:15:57 crc kubenswrapper[4722]: E0226 20:15:57.396735 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184390 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f" exitCode=0 Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f"} Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac"} Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184783 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.351686 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386943 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387164 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387256 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.388774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.389675 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.396888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts" (OuterVolumeSpecName: "scripts") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.404644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9" (OuterVolumeSpecName: "kube-api-access-7t9m9") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "kube-api-access-7t9m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.430774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.473980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490083 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490115 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490127 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490147 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490159 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490167 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.517313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data" (OuterVolumeSpecName: "config-data") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.592538 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.194896 4722 generic.go:334] "Generic (PLEG): container finished" podID="e863110f-e026-4433-8992-8ed0ae33521a" containerID="741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07" exitCode=0 Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.194983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerDied","Data":"741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07"} Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.195348 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.247874 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.257794 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.268786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269232 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269268 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269306 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269323 4722 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269499 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269516 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269527 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269539 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.272268 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.274920 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.275187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.281016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " 
pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505714 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.507081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc 
kubenswrapper[4722]: I0226 20:15:59.507182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.510201 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.510565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.511047 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.512599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.524654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: 
\"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.630972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.107474 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.158282 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" path="/var/lib/kubelet/pods/b4f8a59e-1ccd-4880-946b-e6f48907d4d2/volumes" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.159604 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.160852 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164345 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.168404 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.206359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"8c9cff63477e020d84078860f2efca3214d03c36290d6495bf75e0fc3f652072"} Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.319793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.421824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.440308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.486093 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.776335 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936157 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936236 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.943678 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts" (OuterVolumeSpecName: "scripts") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.943822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt" (OuterVolumeSpecName: "kube-api-access-55mwt") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "kube-api-access-55mwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.975797 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.982468 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.994307 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data" (OuterVolumeSpecName: "config-data") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039158 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039195 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039206 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039214 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.217947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerDied","Data":"3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.217985 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.218022 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.219617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.220813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerStarted","Data":"7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.323355 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:01 crc kubenswrapper[4722]: E0226 20:16:01.323794 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.323815 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.324017 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.324751 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.331358 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.335398 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-crxb6" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.354741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.454583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.454946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.455018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.661024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.661100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.670801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" 
(UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.942420 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:02 crc kubenswrapper[4722]: I0226 20:16:02.373472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} Feb 26 20:16:02 crc kubenswrapper[4722]: I0226 20:16:02.604953 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.383279 4722 generic.go:334] "Generic (PLEG): container finished" podID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerID="81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2" exitCode=0 Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.383371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerDied","Data":"81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"94a25c7f-6346-4ce4-ba05-130047eee9b5","Type":"ContainerStarted","Data":"369d229ee8a85df34510449eaf6b86a5e8766a91573b0f6c0f29ae8f19930fc1"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"94a25c7f-6346-4ce4-ba05-130047eee9b5","Type":"ContainerStarted","Data":"7f3b31bfa3b6138dddf6047675d3959d4787f1178e76340e0b14621537308b57"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385377 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.387268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.430961 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.430942494 podStartE2EDuration="2.430942494s" podCreationTimestamp="2026-02-26 20:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:03.421062124 +0000 UTC m=+1305.958030038" watchObservedRunningTime="2026-02-26 20:16:03.430942494 +0000 UTC m=+1305.967910418" Feb 26 20:16:04 crc kubenswrapper[4722]: I0226 20:16:04.906305 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.071981 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"d98b84a0-bedf-45f7-b9ca-14244b272795\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.080329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5" (OuterVolumeSpecName: "kube-api-access-khbz5") pod "d98b84a0-bedf-45f7-b9ca-14244b272795" (UID: "d98b84a0-bedf-45f7-b9ca-14244b272795"). InnerVolumeSpecName "kube-api-access-khbz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.174549 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.410427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.411798 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerDied","Data":"7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d"} Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414049 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414093 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.459225 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.769702753 podStartE2EDuration="6.459206904s" podCreationTimestamp="2026-02-26 20:15:59 +0000 UTC" firstStartedPulling="2026-02-26 20:16:00.114939931 +0000 UTC m=+1302.651907855" lastFinishedPulling="2026-02-26 20:16:04.804444082 +0000 UTC m=+1307.341412006" observedRunningTime="2026-02-26 20:16:05.453755996 +0000 UTC m=+1307.990723920" watchObservedRunningTime="2026-02-26 20:16:05.459206904 +0000 UTC m=+1307.996174828" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.976634 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.987798 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:16:06 crc kubenswrapper[4722]: I0226 20:16:06.158630 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" path="/var/lib/kubelet/pods/7d4066f0-78d5-4810-9b52-358ed4e1efbd/volumes" Feb 26 20:16:07 crc kubenswrapper[4722]: E0226 20:16:07.657094 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:11 crc kubenswrapper[4722]: I0226 20:16:11.971439 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.460183 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:12 crc kubenswrapper[4722]: E0226 20:16:12.460947 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.460970 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.461232 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.462150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.463740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.471581 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.471781 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.628962 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.631405 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.637658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.670401 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 
26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.697698 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.699542 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.707527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: 
\"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744742 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.756228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc 
kubenswrapper[4722]: I0226 20:16:12.762655 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.766510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.780858 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.807203 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.808855 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.814211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.838930 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846703 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846741 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhtn\" 
(UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846900 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.850090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.851413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.865822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: 
\"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.869120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.889200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.890471 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.896457 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.949040 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950304 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.962353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.962786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.964483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.987676 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.003893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.027099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071656 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.072967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.085869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.094761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.110587 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.127395 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.129548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.150683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.151917 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173390 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.193242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.195844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.205364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 
20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.276068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.388920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " 
pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc 
kubenswrapper[4722]: I0226 20:16:13.390518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.390552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.390888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.391119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.391305 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.398288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.411727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.438414 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.470612 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.744399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.931393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.007181 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.008584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.010662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.010875 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.019036 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.088719 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.127006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229270 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.237086 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.238592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.239186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.273966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.343421 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.368251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.388484 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: W0226 20:16:14.390921 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f7a073a_d911_45e9_8a1d_75de83fa586e.slice/crio-50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06 WatchSource:0}: Error finding container 50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06: Status 404 returned error can't find the container with id 50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06 Feb 26 20:16:14 crc kubenswrapper[4722]: W0226 20:16:14.397282 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc450ab2_f2fd_45a5_9ced_e90c59534894.slice/crio-f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6 WatchSource:0}: Error finding container f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6: Status 404 returned error can't find the container with id f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6 Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.548761 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.551673 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.566354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerStarted","Data":"bcc64080597b2ae7a7214cbe36c9c6e88ca6123db9749e8dfafd7532df58e64d"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.573067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerStarted","Data":"50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.584517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"cce2670f3da4c5ee06b06b9e0a4e5eff97452bf4c62188109c79c282ca267fdf"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.597052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerStarted","Data":"afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.597111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerStarted","Data":"c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.654984 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-kjxc5" podStartSLOduration=2.654962453 podStartE2EDuration="2.654962453s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:14.625280764 +0000 UTC m=+1317.162248708" watchObservedRunningTime="2026-02-26 20:16:14.654962453 +0000 UTC m=+1317.191930377" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.934597 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.614201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerStarted","Data":"95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.614746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerStarted","Data":"c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.622287 4722 generic.go:334] "Generic (PLEG): container finished" podID="eaffdc9e-b717-46c2-929f-791a7940268f" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" exitCode=0 Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.623087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.623148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" 
event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerStarted","Data":"5d5a507d85444f5424a03cabe1cf4e839a26588e0a7cd89e35d3d55ebf30d4dd"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.646556 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" podStartSLOduration=2.646540044 podStartE2EDuration="2.646540044s" podCreationTimestamp="2026-02-26 20:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:15.643330026 +0000 UTC m=+1318.180297950" watchObservedRunningTime="2026-02-26 20:16:15.646540044 +0000 UTC m=+1318.183507968" Feb 26 20:16:16 crc kubenswrapper[4722]: I0226 20:16:16.927294 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:16 crc kubenswrapper[4722]: I0226 20:16:16.940901 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:17 crc kubenswrapper[4722]: E0226 20:16:17.958400 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:18 crc kubenswrapper[4722]: E0226 20:16:18.173414 4722 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/869651b8bf169ec5b1d93a8c55b0504c076ac16ac416657f266aecde1bb25435/diff" to get inode usage: stat /var/lib/containers/storage/overlay/869651b8bf169ec5b1d93a8c55b0504c076ac16ac416657f266aecde1bb25435/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_neutron-7b7cfb9b54-qvhbm_7810fb24-84d9-45c8-9456-7d1a6c6c8fff/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-7b7cfb9b54-qvhbm_7810fb24-84d9-45c8-9456-7d1a6c6c8fff/neutron-api/0.log: no such file or directory Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.660672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.660711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663303 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663596 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" containerID="cri-o://bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663827 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" containerID="cri-o://45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.667303 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerStarted","Data":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.667392 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.675111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerStarted","Data":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.675238 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.676773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerStarted","Data":"87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.691114 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.959328817 podStartE2EDuration="6.691094349s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 
20:16:13.788272925 +0000 UTC m=+1316.325240849" lastFinishedPulling="2026-02-26 20:16:17.520038457 +0000 UTC m=+1320.057006381" observedRunningTime="2026-02-26 20:16:18.681534069 +0000 UTC m=+1321.218502003" watchObservedRunningTime="2026-02-26 20:16:18.691094349 +0000 UTC m=+1321.228062273" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.716092 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.570779298 podStartE2EDuration="6.7160761s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:14.414572212 +0000 UTC m=+1316.951540126" lastFinishedPulling="2026-02-26 20:16:17.559869004 +0000 UTC m=+1320.096836928" observedRunningTime="2026-02-26 20:16:18.701335069 +0000 UTC m=+1321.238302993" watchObservedRunningTime="2026-02-26 20:16:18.7160761 +0000 UTC m=+1321.253044024" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.734329 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.6192089689999998 podStartE2EDuration="6.734311567s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:14.404732924 +0000 UTC m=+1316.941700848" lastFinishedPulling="2026-02-26 20:16:17.519835512 +0000 UTC m=+1320.056803446" observedRunningTime="2026-02-26 20:16:18.721026985 +0000 UTC m=+1321.257994919" watchObservedRunningTime="2026-02-26 20:16:18.734311567 +0000 UTC m=+1321.271279491" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.741292 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.155867892 podStartE2EDuration="6.741277317s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:13.935307292 +0000 UTC m=+1316.472275216" lastFinishedPulling="2026-02-26 20:16:17.520716717 +0000 UTC m=+1320.057684641" 
observedRunningTime="2026-02-26 20:16:18.738728707 +0000 UTC m=+1321.275696631" watchObservedRunningTime="2026-02-26 20:16:18.741277317 +0000 UTC m=+1321.278245241" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.770855 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" podStartSLOduration=5.770835963 podStartE2EDuration="5.770835963s" podCreationTimestamp="2026-02-26 20:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:18.75642808 +0000 UTC m=+1321.293396004" watchObservedRunningTime="2026-02-26 20:16:18.770835963 +0000 UTC m=+1321.307803887" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.373483 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 
20:16:19.457081 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs" (OuterVolumeSpecName: "logs") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.457654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.458321 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.462578 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6" (OuterVolumeSpecName: "kube-api-access-c56n6") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "kube-api-access-c56n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.484837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.502837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data" (OuterVolumeSpecName: "config-data") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.559972 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.560015 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.560027 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690794 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" exitCode=0 Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690832 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" exitCode=143 Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690893 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.691003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.691018 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.738391 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.743869 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.771081 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.787219 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.787885 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc 
kubenswrapper[4722]: I0226 20:16:19.787904 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.787944 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.787951 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.788171 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.788185 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.789281 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.791857 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.792069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.823051 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.854735 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.855232 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855258 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} err="failed to get container status \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855281 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc 
kubenswrapper[4722]: E0226 20:16:19.855963 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855988 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} err="failed to get container status \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856002 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856211 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} err="failed to get container status \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856232 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 
20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856517 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} err="failed to get container status \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.867113 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " 
pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969684 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.972797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.973914 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.974843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.987824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mp5\" (UniqueName: 
\"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.109059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.163800 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" path="/var/lib/kubelet/pods/fc450ab2-f2fd-45a5-9ced-e90c59534894/volumes" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.591571 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.703965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"b32b9f4dc0661f437f3dd3dd0ca79f7b5acc0e19867fb4fc59ec0609a2de7103"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.718815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.719845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.745768 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.74574185 podStartE2EDuration="2.74574185s" podCreationTimestamp="2026-02-26 20:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:21.739413688 +0000 UTC m=+1324.276381632" watchObservedRunningTime="2026-02-26 20:16:21.74574185 +0000 UTC m=+1324.282709784" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.732787 4722 generic.go:334] "Generic (PLEG): container finished" podID="85ac107a-489c-4551-a4ed-49cd15006d82" containerID="95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec" exitCode=0 Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.732941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerDied","Data":"95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec"} Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.917436 4722 scope.go:117] "RemoveContainer" containerID="729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.989630 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.989678 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.027818 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.414041 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.414409 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.454751 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 
20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.472269 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.487595 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.487923 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.557592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.557853 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" containerID="cri-o://0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" gracePeriod=10 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.747438 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerID="0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" exitCode=0 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.747517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" 
event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9"} Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.752054 4722 generic.go:334] "Generic (PLEG): container finished" podID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerID="afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3" exitCode=0 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.752232 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerDied","Data":"afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3"} Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.823245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.077412 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.077453 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.334947 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.340700 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.413190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.413228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414187 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.420431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts" (OuterVolumeSpecName: "scripts") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.422826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62" (OuterVolumeSpecName: "kube-api-access-zzs62") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "kube-api-access-zzs62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.438502 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659" (OuterVolumeSpecName: "kube-api-access-9f659") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "kube-api-access-9f659". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.486442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.493494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data" (OuterVolumeSpecName: "config-data") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.494239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.496740 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config" (OuterVolumeSpecName: "config") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.515704 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517382 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517425 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517440 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517458 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517473 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517487 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517501 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 
20:16:24.517513 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.549543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.558515 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.621088 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.621161 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765490 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerDied","Data":"c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0"} Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765535 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765623 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16"} Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777146 4722 scope.go:117] "RemoveContainer" containerID="0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777343 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.808852 4722 scope.go:117] "RemoveContainer" containerID="410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.830230 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.845695 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872379 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872870 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872897 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872924 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="init" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872931 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="init" Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872949 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872955 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.873173 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.873195 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.874081 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.879202 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.885077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.028815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.028909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.029016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.034444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.045120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 
crc kubenswrapper[4722]: I0226 20:16:25.046059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.115301 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.115694 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.199759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.280024 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.334625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335634 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.343413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts" (OuterVolumeSpecName: "scripts") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.345185 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7" (OuterVolumeSpecName: "kube-api-access-6xwg7") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "kube-api-access-6xwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.374865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.379448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data" (OuterVolumeSpecName: "config-data") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437690 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437727 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437736 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437745 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.719893 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: W0226 20:16:25.720205 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24c715e3_32ab_4d06_b3d3_4ce8281bb54b.slice/crio-b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9 WatchSource:0}: Error finding container b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9: Status 404 returned error can't find the container with id b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerDied","Data":"c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45"} Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787699 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787416 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.788628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"24c715e3-32ab-4d06-b3d3-4ce8281bb54b","Type":"ContainerStarted","Data":"b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9"} Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915038 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915267 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" containerID="cri-o://900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" gracePeriod=30 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915658 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" containerID="cri-o://cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" gracePeriod=30 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.962320 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.976491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.976939 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" containerID="cri-o://87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.156414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" path="/var/lib/kubelet/pods/fc52c422-c3c5-4b3d-81a3-57ee15cca146/volumes" Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.802379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"24c715e3-32ab-4d06-b3d3-4ce8281bb54b","Type":"ContainerStarted","Data":"6f0ae06c9811b2130b9fbedb5fbc2658cb5c0e9eb5bcac1e6a2a927b287be9de"} Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.802726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806034 4722 generic.go:334] "Generic (PLEG): container finished" podID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" exitCode=143 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806319 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" containerID="cri-o://9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806359 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" containerID="cri-o://20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.826690 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8266710379999997 podStartE2EDuration="2.826671038s" podCreationTimestamp="2026-02-26 20:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:26.82490628 +0000 UTC m=+1329.361874244" watchObservedRunningTime="2026-02-26 20:16:26.826671038 +0000 UTC m=+1329.363638952" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.449415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.473972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474042 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod 
\"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.486213 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs" (OuterVolumeSpecName: "logs") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.492351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5" (OuterVolumeSpecName: "kube-api-access-x5mp5") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "kube-api-access-x5mp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.522929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.527232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data" (OuterVolumeSpecName: "config-data") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.566610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584164 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584202 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584219 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584234 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584247 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819413 4722 generic.go:334] "Generic (PLEG): container finished" podID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" exitCode=0 Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819452 4722 generic.go:334] "Generic (PLEG): container finished" podID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" exitCode=143 Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"b32b9f4dc0661f437f3dd3dd0ca79f7b5acc0e19867fb4fc59ec0609a2de7103"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819636 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.820442 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.854033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.855446 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.881408 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.892753 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893534 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893577 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893639 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893648 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893661 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893669 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893991 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.894042 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.894070 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.895817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.897644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.898578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.905733 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.910157 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910199 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} err="failed to get container status 
\"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910224 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.910644 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910667 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} err="failed to get container status \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910684 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910964 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} err="failed to get 
container status \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910983 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.912604 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} err="failed to get container status \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.924685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " 
pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093768 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093888 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.094305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.098316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.098366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.114790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.119182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.161236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" path="/var/lib/kubelet/pods/15cfff11-3c4a-4be4-b6b5-72544ea7a455/volumes" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.225692 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.420179 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.423610 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.425722 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.425802 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:28 crc kubenswrapper[4722]: W0226 20:16:28.714073 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11dbf936_bb20_4a48_a17c_4814f49ffddd.slice/crio-99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14 WatchSource:0}: Error finding container 
99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14: Status 404 returned error can't find the container with id 99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14 Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.723488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.831075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.642867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.684298 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: 
\"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.727442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs" (OuterVolumeSpecName: "logs") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.736021 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn" (OuterVolumeSpecName: "kube-api-access-jzhtn") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "kube-api-access-jzhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.759477 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data podName:458148a5-b954-49a8-81b8-5b5505dbd46c nodeName:}" failed. No retries permitted until 2026-02-26 20:16:30.259451617 +0000 UTC m=+1332.796419541 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c") : error deleting /var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volume-subpaths: remove /var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volume-subpaths: no such file or directory Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.762832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830667 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830703 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830716 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.842429 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} Feb 26 
20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.842474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.843845 4722 generic.go:334] "Generic (PLEG): container finished" podID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" exitCode=0 Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.843912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerDied","Data":"87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845382 4722 generic.go:334] "Generic (PLEG): container finished" podID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" exitCode=0 Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"cce2670f3da4c5ee06b06b9e0a4e5eff97452bf4c62188109c79c282ca267fdf"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845448 4722 scope.go:117] "RemoveContainer" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845556 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.868204 4722 scope.go:117] "RemoveContainer" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.869989 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.869964079 podStartE2EDuration="2.869964079s" podCreationTimestamp="2026-02-26 20:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:29.860549722 +0000 UTC m=+1332.397517646" watchObservedRunningTime="2026-02-26 20:16:29.869964079 +0000 UTC m=+1332.406932013" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888288 4722 scope.go:117] "RemoveContainer" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.888771 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": container with ID starting with cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52 not found: ID does not exist" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888827 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} err="failed to get container status \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": rpc error: code = NotFound desc = could not find container \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": container with ID starting with cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52 not 
found: ID does not exist" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888856 4722 scope.go:117] "RemoveContainer" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.889178 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": container with ID starting with 900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57 not found: ID does not exist" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.889202 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} err="failed to get container status \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": rpc error: code = NotFound desc = could not find container \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": container with ID starting with 900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57 not found: ID does not exist" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.908837 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.054337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq" (OuterVolumeSpecName: "kube-api-access-ck5pq") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "kube-api-access-ck5pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.090733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data" (OuterVolumeSpecName: "config-data") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.100082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165259 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165286 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165295 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.234189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.266712 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.273879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data" (OuterVolumeSpecName: "config-data") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.369687 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.480845 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.490694 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504107 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504848 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504869 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504885 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504892 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 
20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504916 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505191 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505202 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.506369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.508646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.527746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 
20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.682084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.686480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.686710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.711157 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.825540 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerDied","Data":"50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06"} Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879123 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879175 4722 scope.go:117] "RemoveContainer" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.962562 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.994300 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.016457 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.018308 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.020777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.027100 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.092284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.095855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.095913 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198160 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.205759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.206105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.222843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: W0226 20:16:31.340219 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3c41fb8_288c_4c58_b50c_2b253d825fee.slice/crio-afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595 WatchSource:0}: Error finding container afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595: Status 404 returned error can't find the container with id afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595 Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.341011 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.341705 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.959816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.158837 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" path="/var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volumes" Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.159446 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" path="/var/lib/kubelet/pods/5f7a073a-d911-45e9-8a1d-75de83fa586e/volumes" Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.189784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:32 crc kubenswrapper[4722]: W0226 20:16:32.189873 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c669562_253c_4085_9e5c_04dfd8ae4338.slice/crio-8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e 
WatchSource:0}: Error finding container 8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e: Status 404 returned error can't find the container with id 8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.971191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.971499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.973491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerStarted","Data":"9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.973553 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerStarted","Data":"8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.988415 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.988394848 podStartE2EDuration="2.988394848s" podCreationTimestamp="2026-02-26 20:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:32.987199556 +0000 UTC m=+1335.524167480" watchObservedRunningTime="2026-02-26 20:16:32.988394848 +0000 UTC m=+1335.525362782" Feb 26 20:16:33 
crc kubenswrapper[4722]: I0226 20:16:33.012055 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.012034852 podStartE2EDuration="3.012034852s" podCreationTimestamp="2026-02-26 20:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:33.008764303 +0000 UTC m=+1335.545732227" watchObservedRunningTime="2026-02-26 20:16:33.012034852 +0000 UTC m=+1335.549002776" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.227183 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.227234 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.626516 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.626754 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" containerID="cri-o://1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" gracePeriod=30 Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.985030 4722 generic.go:334] "Generic (PLEG): container finished" podID="e6617222-c81a-46cc-9c98-1170f7c89846" containerID="1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" exitCode=2 Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.985190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerDied","Data":"1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba"} Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 
20:16:34.197693 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.282574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"e6617222-c81a-46cc-9c98-1170f7c89846\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.288379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8" (OuterVolumeSpecName: "kube-api-access-rnwp8") pod "e6617222-c81a-46cc-9c98-1170f7c89846" (UID: "e6617222-c81a-46cc-9c98-1170f7c89846"). InnerVolumeSpecName "kube-api-access-rnwp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.386032 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:34.999729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerDied","Data":"4c5c905412b487d64b54a6c3d784b133430d8947b0b99214d7dbe7ea6a0f0b96"} Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:34.999776 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.000050 4722 scope.go:117] "RemoveContainer" containerID="1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.038553 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.051095 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: E0226 20:16:35.070715 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070730 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.071795 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.078054 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.078423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.087977 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbwgf\" (UniqueName: 
\"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.200570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.200814 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.201006 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.201061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwgf\" (UniqueName: \"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.206164 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.206223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.208319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.220170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbwgf\" (UniqueName: \"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.395367 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.881499 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882118 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" containerID="cri-o://7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882162 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" containerID="cri-o://d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882191 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" containerID="cri-o://5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882250 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" containerID="cri-o://171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" gracePeriod=30 Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.058665 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" exitCode=2 Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.058972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.059528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.167083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" path="/var/lib/kubelet/pods/e6617222-c81a-46cc-9c98-1170f7c89846/volumes" Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.342174 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080568 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" exitCode=0 Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080831 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" exitCode=0 Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.083989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"6e07189c-f69a-4914-8fe7-efbdcf3c5882","Type":"ContainerStarted","Data":"8a1aea0de6e9b68822aa8f1c6da79532e3eb500eaaa75046f307eea3d1ca7f7f"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.084028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6e07189c-f69a-4914-8fe7-efbdcf3c5882","Type":"ContainerStarted","Data":"859425f0a04f6ff829a9381399a92d3eea9ca5382579b151ca9453832dd6cde8"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.085276 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.104086 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.740496904 podStartE2EDuration="2.104067532s" podCreationTimestamp="2026-02-26 20:16:35 +0000 UTC" firstStartedPulling="2026-02-26 20:16:36.05631164 +0000 UTC m=+1338.593279564" lastFinishedPulling="2026-02-26 20:16:36.419882268 +0000 UTC m=+1338.956850192" observedRunningTime="2026-02-26 20:16:37.099407666 +0000 UTC m=+1339.636375600" watchObservedRunningTime="2026-02-26 20:16:37.104067532 +0000 UTC m=+1339.641035456" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.227462 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.227873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.723092 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895704 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.898027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.898561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.901961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts" (OuterVolumeSpecName: "scripts") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.910205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n" (OuterVolumeSpecName: "kube-api-access-vzx4n") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "kube-api-access-vzx4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.946639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000726 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000766 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000778 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000789 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000803 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.010056 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.055329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data" (OuterVolumeSpecName: "config-data") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.101854 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.101884 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106415 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" exitCode=0 Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106551 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"8c9cff63477e020d84078860f2efca3214d03c36290d6495bf75e0fc3f652072"} Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106568 4722 scope.go:117] "RemoveContainer" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106517 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.133802 4722 scope.go:117] "RemoveContainer" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.184941 4722 scope.go:117] "RemoveContainer" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.192937 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.235935 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.248339 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.248656 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250193 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250858 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250880 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 
crc kubenswrapper[4722]: E0226 20:16:39.250898 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250903 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250920 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250926 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250943 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250949 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251125 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251159 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251174 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251191 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" 
containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.259408 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261814 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261968 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.263888 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.269317 4722 scope.go:117] "RemoveContainer" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.293359 4722 scope.go:117] "RemoveContainer" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.294036 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": container with ID starting with 7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d not found: ID does not exist" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294069 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} err="failed to get container status 
\"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": rpc error: code = NotFound desc = could not find container \"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": container with ID starting with 7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294090 4722 scope.go:117] "RemoveContainer" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.294815 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": container with ID starting with d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747 not found: ID does not exist" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294870 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} err="failed to get container status \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": rpc error: code = NotFound desc = could not find container \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": container with ID starting with d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747 not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294901 4722 scope.go:117] "RemoveContainer" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.295640 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": container with ID starting with 5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a not found: ID does not exist" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.295682 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} err="failed to get container status \"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": rpc error: code = NotFound desc = could not find container \"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": container with ID starting with 5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.295711 4722 scope.go:117] "RemoveContainer" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.296105 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": container with ID starting with 171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a not found: ID does not exist" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.296129 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} err="failed to get container status \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": rpc error: code = NotFound desc = could not find container \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": container with ID 
starting with 171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.312770 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313032 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc 
kubenswrapper[4722]: I0226 20:16:39.313545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414670 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc 
kubenswrapper[4722]: I0226 20:16:39.415502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.415530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.419846 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.419862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.420939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.421639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.422259 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.433924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.577593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.032641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:40 crc kubenswrapper[4722]: W0226 20:16:40.036692 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod831065e5_8f9c_4cc4_bffa_e2d82a3a2244.slice/crio-6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86 WatchSource:0}: Error finding container 6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86: Status 404 returned error can't find the container with id 6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86 Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.116901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86"} Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.157010 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" path="/var/lib/kubelet/pods/6155bd98-22a4-476d-9572-8f172f4e8cc2/volumes" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.826743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.827101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.130645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f"} Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.343100 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.390186 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.908318 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.908385 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.141764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64"} Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.141809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a"} Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.177659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.196623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a"} Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.197389 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.227440 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.176069166 podStartE2EDuration="6.227417647s" podCreationTimestamp="2026-02-26 20:16:39 +0000 UTC" firstStartedPulling="2026-02-26 20:16:40.039055382 +0000 UTC m=+1342.576023306" lastFinishedPulling="2026-02-26 20:16:44.090403863 +0000 UTC m=+1346.627371787" observedRunningTime="2026-02-26 20:16:45.219761819 +0000 UTC m=+1347.756729813" watchObservedRunningTime="2026-02-26 20:16:45.227417647 +0000 UTC m=+1347.764385581" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.412319 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.233625 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.234613 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.239858 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: E0226 20:16:48.825109 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39c3d27_7241_4634_87af_841ab87e17c0.slice/crio-8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39c3d27_7241_4634_87af_841ab87e17c0.slice/crio-conmon-8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.164036 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240697 4722 generic.go:334] "Generic (PLEG): container finished" podID="a39c3d27-7241-4634-87af-841ab87e17c0" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" exitCode=137 Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240748 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerDied","Data":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.242104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerDied","Data":"bcc64080597b2ae7a7214cbe36c9c6e88ca6123db9749e8dfafd7532df58e64d"} Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.242214 4722 scope.go:117] "RemoveContainer" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.250071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.288582 4722 scope.go:117] "RemoveContainer" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: E0226 20:16:49.289250 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": container with ID starting with 8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460 not found: ID does not exist" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.289307 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} err="failed to get container status 
\"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": rpc error: code = NotFound desc = could not find container \"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": container with ID starting with 8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460 not found: ID does not exist" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.340701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv" (OuterVolumeSpecName: "kube-api-access-xkqtv") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "kube-api-access-xkqtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.358825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data" (OuterVolumeSpecName: "config-data") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.368889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435389 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435423 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435440 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.577122 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.595744 4722 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614157 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: E0226 20:16:49.614598 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614610 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.615486 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.615560 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.637893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.637995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.640689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.740097 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.842962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843109 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843537 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.846536 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.846898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.847534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.850446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.862501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.953797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.164352 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" path="/var/lib/kubelet/pods/a39c3d27-7241-4634-87af-841ab87e17c0/volumes" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.425932 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.835608 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.840205 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.065650 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.067717 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.079273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.084930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.084982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085144 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.193546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") 
pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.234003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.266978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea10c214-f090-4ada-b1dd-ec1e9a153fb1","Type":"ContainerStarted","Data":"41e81541b2e9fed79a0e6de5102ec075068986757963be8206acafb55dd7d487"} Feb 26 20:16:51 crc 
kubenswrapper[4722]: I0226 20:16:51.267053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea10c214-f090-4ada-b1dd-ec1e9a153fb1","Type":"ContainerStarted","Data":"55f172526018b9886aa5f15cfe1b8b31b8ef0ca91dc8a6a0bf6904373b078c13"} Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.294739 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.294718824 podStartE2EDuration="2.294718824s" podCreationTimestamp="2026-02-26 20:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:51.282757188 +0000 UTC m=+1353.819725132" watchObservedRunningTime="2026-02-26 20:16:51.294718824 +0000 UTC m=+1353.831686748" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.393341 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.996363 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:52 crc kubenswrapper[4722]: I0226 20:16:52.277169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerStarted","Data":"5aa06895449e8178118801bc34ee6a228ece2474fb523cfc5dcb8d816767e6f8"} Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.287193 4722 generic.go:334] "Generic (PLEG): container finished" podID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerID="70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027" exitCode=0 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.287293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" 
event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027"} Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.445914 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.446683 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent" containerID="cri-o://f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447120 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd" containerID="cri-o://c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447226 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core" containerID="cri-o://5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447159 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent" containerID="cri-o://a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.487226 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.487284 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.113714 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.113977 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" containerID="cri-o://8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" gracePeriod=30 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.115011 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" containerID="cri-o://39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" gracePeriod=30 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298570 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a" exitCode=0 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298883 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64" exitCode=2 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298895 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" 
containerID="f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f" exitCode=0 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.300973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerStarted","Data":"91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.301175 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.302859 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" exitCode=143 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.302893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.322971 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78468d7767-275dc" podStartSLOduration=3.3229453749999998 podStartE2EDuration="3.322945375s" podCreationTimestamp="2026-02-26 20:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:54.319160431 +0000 UTC m=+1356.856128355" watchObservedRunningTime="2026-02-26 20:16:54.322945375 +0000 UTC m=+1356.859913319" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.956717 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.362256 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a" exitCode=0 Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.362449 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a"} Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.538536 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628471 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628676 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.629065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.629554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.635094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts" (OuterVolumeSpecName: "scripts") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.637418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg" (OuterVolumeSpecName: "kube-api-access-zv6hg") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "kube-api-access-zv6hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.659891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.684103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731071 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731103 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731112 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731125 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731143 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731153 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.737584 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.744460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.772331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data" (OuterVolumeSpecName: "config-data") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " Feb 26 20:16:57 crc 
kubenswrapper[4722]: I0226 20:16:57.832400 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs" (OuterVolumeSpecName: "logs") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832903 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832918 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832927 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.836896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq" (OuterVolumeSpecName: "kube-api-access-56lpq") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "kube-api-access-56lpq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.863639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.871224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data" (OuterVolumeSpecName: "config-data") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.941657 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.941980 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.942025 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.375747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86"} Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.376925 4722 scope.go:117] "RemoveContainer" containerID="c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.375806 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377906 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" exitCode=0 Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"} Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595"} Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.378024 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.416817 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.436401 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.443304 4722 scope.go:117] "RemoveContainer" containerID="5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.448704 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460036 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460492 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460510 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460518 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460524 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460549 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" 
containerName="ceilometer-notification-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460561 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460566 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460596 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460605 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460611 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460790 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460811 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460822 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460835 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460845 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460855 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.462006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.470449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.470642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.471211 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.478078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.482617 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.500925 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.504075 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.504310 4722 scope.go:117] "RemoveContainer" containerID="a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.510771 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.510937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.519346 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.549211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" 
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.557305 4722 scope.go:117] "RemoveContainer" containerID="f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.599468 4722 scope.go:117] "RemoveContainer" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " 
pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656995 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.657054 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.657115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.658345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.659770 4722 scope.go:117] "RemoveContainer" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.661643 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.662218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.662331 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.663686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.690165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.758634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.758990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc 
kubenswrapper[4722]: I0226 20:16:58.759062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod 
\"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.762738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.763989 4722 scope.go:117] "RemoveContainer" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.764510 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": container with ID starting with 39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d not found: ID does not exist" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764647 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"} err="failed to get container status \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": rpc error: code = NotFound desc = could not find container \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": container with ID starting with 39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d not found: ID does not exist" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764712 4722 scope.go:117] "RemoveContainer" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.764939 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": container with ID starting with 8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9 not found: ID does not exist" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.765011 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} err="failed to get container status \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": rpc error: code = NotFound desc = could not find container \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": container with ID starting with 8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9 not found: ID does not 
exist" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.765018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.769256 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.780082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.811411 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.836290 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.277012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.391632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"505acbc17e558c7054431a853ca079dd636a8bf61f9213e492058c47f1c13364"} Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.396172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:59 crc kubenswrapper[4722]: W0226 20:16:59.396310 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97329a8f_4016_43a9_8589_ee3c1b05aacb.slice/crio-0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656 WatchSource:0}: Error finding container 0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656: Status 404 returned error can't find the container with id 0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656 Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.956974 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.977360 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.157604 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" path="/var/lib/kubelet/pods/831065e5-8f9c-4cc4-bffa-e2d82a3a2244/volumes" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.158366 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" 
path="/var/lib/kubelet/pods/f3c41fb8-288c-4c58-b50c-2b253d825fee/volumes" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.405700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"} Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.405766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"} Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.411823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8"} Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.411885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656"} Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.438031 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.439234 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.439203236 podStartE2EDuration="2.439203236s" podCreationTimestamp="2026-02-26 20:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:00.425813105 +0000 UTC m=+1362.962781049" watchObservedRunningTime="2026-02-26 20:17:00.439203236 +0000 UTC m=+1362.976171200" Feb 
26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.641184 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.642606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.645253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.646919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.658488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: 
\"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.820324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.820769 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.821860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.821987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") 
" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.825599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.826679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.828609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.842742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.033299 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.394923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.428937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0"} Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.525839 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.526130 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" containerID="cri-o://4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" gracePeriod=10 Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.638975 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:01 crc kubenswrapper[4722]: W0226 20:17:01.653427 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b3d3547_11a7_4e10_b57a_a057d2c60e70.slice/crio-e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8 WatchSource:0}: Error finding container e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8: Status 404 returned error can't find the container with id e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8 Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.358393 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.467011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.467110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.473415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.475204 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.475292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.480526 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r82p\" 
(UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487435 4722 generic.go:334] "Generic (PLEG): container finished" podID="eaffdc9e-b717-46c2-929f-791a7940268f" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" exitCode=0 Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"5d5a507d85444f5424a03cabe1cf4e839a26588e0a7cd89e35d3d55ebf30d4dd"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487563 4722 scope.go:117] "RemoveContainer" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487701 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.500095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerStarted","Data":"c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.500157 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerStarted","Data":"e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.511931 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p" (OuterVolumeSpecName: "kube-api-access-6r82p") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "kube-api-access-6r82p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.520627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.540533 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2m4wz" podStartSLOduration=2.540515354 podStartE2EDuration="2.540515354s" podCreationTimestamp="2026-02-26 20:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:02.536201968 +0000 UTC m=+1365.073169902" watchObservedRunningTime="2026-02-26 20:17:02.540515354 +0000 UTC m=+1365.077483278" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.550330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config" (OuterVolumeSpecName: "config") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.568316 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.577026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.585562 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587896 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587924 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587933 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587943 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587953 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.615369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.631492 4722 scope.go:117] "RemoveContainer" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.656060 4722 scope.go:117] "RemoveContainer" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: E0226 20:17:02.659093 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": container with ID starting with 4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946 not found: ID does not exist" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.659161 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} err="failed to get container status \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": rpc error: code = NotFound desc = could not 
find container \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": container with ID starting with 4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946 not found: ID does not exist" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.659194 4722 scope.go:117] "RemoveContainer" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: E0226 20:17:02.662882 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": container with ID starting with 0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de not found: ID does not exist" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.662926 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de"} err="failed to get container status \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": rpc error: code = NotFound desc = could not find container \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": container with ID starting with 0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de not found: ID does not exist" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.690343 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.827314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.841266 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.534622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f"} Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.535839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.571651 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7290036020000001 podStartE2EDuration="5.571633227s" podCreationTimestamp="2026-02-26 20:16:58 +0000 UTC" firstStartedPulling="2026-02-26 20:16:59.398444884 +0000 UTC m=+1361.935412808" lastFinishedPulling="2026-02-26 20:17:03.241074509 +0000 UTC m=+1365.778042433" observedRunningTime="2026-02-26 20:17:03.555964425 +0000 UTC m=+1366.092932359" watchObservedRunningTime="2026-02-26 20:17:03.571633227 +0000 UTC m=+1366.108601151" Feb 26 20:17:04 crc kubenswrapper[4722]: I0226 20:17:04.157459 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" path="/var/lib/kubelet/pods/eaffdc9e-b717-46c2-929f-791a7940268f/volumes" Feb 26 20:17:07 crc kubenswrapper[4722]: I0226 20:17:07.583605 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerID="c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e" exitCode=0 Feb 26 20:17:07 crc kubenswrapper[4722]: I0226 20:17:07.583693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerDied","Data":"c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e"} Feb 26 20:17:08 crc kubenswrapper[4722]: I0226 
20:17:08.811575 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:08 crc kubenswrapper[4722]: I0226 20:17:08.811899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.098437 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215064 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215599 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215694 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.222615 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts" (OuterVolumeSpecName: "scripts") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.234561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7" (OuterVolumeSpecName: "kube-api-access-rjdc7") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "kube-api-access-rjdc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.244851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data" (OuterVolumeSpecName: "config-data") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.248173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.317949 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318006 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318021 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318033 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerDied","Data":"e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8"} Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609517 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609589 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795182 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795434 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" containerID="cri-o://5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795475 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" containerID="cri-o://17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.802670 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": EOF" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.802669 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": EOF" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.820418 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.821968 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" containerID="cri-o://9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" gracePeriod=30 Feb 
26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835338 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835620 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" containerID="cri-o://4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835652 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" containerID="cri-o://57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" gracePeriod=30 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.621357 4722 generic.go:334] "Generic (PLEG): container finished" podID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" exitCode=143 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.621662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"} Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.631772 4722 generic.go:334] "Generic (PLEG): container finished" podID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" exitCode=143 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.631830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} Feb 26 20:17:10 crc 
kubenswrapper[4722]: I0226 20:17:10.635005 4722 generic.go:334] "Generic (PLEG): container finished" podID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerID="9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" exitCode=0 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.635030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerDied","Data":"9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492"} Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.149917 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.297916 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc" (OuterVolumeSpecName: "kube-api-access-pgqzc") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "kube-api-access-pgqzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.311350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data" (OuterVolumeSpecName: "config-data") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.314302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378419 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378450 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378460 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647771 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerDied","Data":"8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e"} Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647827 4722 scope.go:117] "RemoveContainer" containerID="9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647874 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.688036 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.699535 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.718889 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719469 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719492 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719520 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719530 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719546 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="init" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719554 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="init" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719573 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719829 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719859 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719877 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.720849 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.724399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.736385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.785364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.785449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc 
kubenswrapper[4722]: I0226 20:17:11.785672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.886937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.887094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.887151 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.890655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.890660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.902282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.042859 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.164065 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" path="/var/lib/kubelet/pods/3c669562-253c-4085-9e5c-04dfd8ae4338/volumes" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.512327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.657955 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37cb4b4d-ebfb-4070-b002-a20ec25dce18","Type":"ContainerStarted","Data":"2c2d5b50f812a895fd2046d3a02ac19bb5597f3d1ae6da3f17acd8309a8450a4"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.587794 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.621917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622449 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.629677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs" (OuterVolumeSpecName: "logs") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.650290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf" (OuterVolumeSpecName: "kube-api-access-nkzcf") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "kube-api-access-nkzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.681966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data" (OuterVolumeSpecName: "config-data") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.688969 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.691985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37cb4b4d-ebfb-4070-b002-a20ec25dce18","Type":"ContainerStarted","Data":"7ecdb1487fe012f8b0eb21c2fe1b56e5b1f978d54439d4e5a3c9d7e8c055d07c"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696739 4722 generic.go:334] "Generic (PLEG): container finished" podID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" exitCode=0 Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696824 4722 scope.go:117] "RemoveContainer" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696930 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.725545 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.725526614 podStartE2EDuration="2.725526614s" podCreationTimestamp="2026-02-26 20:17:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:13.718595027 +0000 UTC m=+1376.255562951" watchObservedRunningTime="2026-02-26 20:17:13.725526614 +0000 UTC m=+1376.262494538" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732317 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732365 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732383 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732396 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.758414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: 
"11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.773031 4722 scope.go:117] "RemoveContainer" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795314 4722 scope.go:117] "RemoveContainer" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: E0226 20:17:13.795828 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": container with ID starting with 57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65 not found: ID does not exist" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795918 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} err="failed to get container status \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": rpc error: code = NotFound desc = could not find container \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": container with ID starting with 57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65 not found: ID does not exist" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795989 4722 scope.go:117] "RemoveContainer" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: E0226 20:17:13.796572 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": container 
with ID starting with 4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047 not found: ID does not exist" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.796649 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} err="failed to get container status \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": rpc error: code = NotFound desc = could not find container \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": container with ID starting with 4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047 not found: ID does not exist" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.834426 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.032814 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.042994 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057411 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: E0226 20:17:14.057859 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: E0226 20:17:14.057905 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057913 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.058187 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.058211 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.059315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.062305 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.062417 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.109731 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.185589 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" path="/var/lib/kubelet/pods/11dbf936-bb20-4a48-a17c-4814f49ffddd/volumes" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242184 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 
20:17:14.344641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.345357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.349774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod 
\"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.360377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.360789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.375716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.461730 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.920068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: W0226 20:17:14.920513 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5d6d9cc_9697_46cc_ab38_7879ef449ab3.slice/crio-c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef WatchSource:0}: Error finding container c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef: Status 404 returned error can't find the container with id c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.639886 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc 
kubenswrapper[4722]: I0226 20:17:15.713241 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.714094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs" (OuterVolumeSpecName: "logs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.718834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f" (OuterVolumeSpecName: "kube-api-access-l676f") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "kube-api-access-l676f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721550 4722 generic.go:334] "Generic (PLEG): container finished" podID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" exitCode=0 Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"} Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"505acbc17e558c7054431a853ca079dd636a8bf61f9213e492058c47f1c13364"} Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721896 4722 scope.go:117] "RemoveContainer" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.722092 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"079cf554d4977ea59dfd739fc850a62fb34fe6d4d948a6bb72a86cff1c5c667e"} Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725353 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"857f7e2727575929c5833b7dcfd55dcd59e35e358ff0876d40270623a25470a1"} Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef"} Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.746931 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.74690911 podStartE2EDuration="1.74690911s" podCreationTimestamp="2026-02-26 20:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:15.743880119 +0000 UTC m=+1378.280848063" watchObservedRunningTime="2026-02-26 20:17:15.74690911 +0000 UTC m=+1378.283877044" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.753295 4722 scope.go:117] "RemoveContainer" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.768897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data" (OuterVolumeSpecName: "config-data") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: 
"63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.776569 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.778679 4722 scope.go:117] "RemoveContainer" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779088 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: E0226 20:17:15.779388 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": container with ID starting with 17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25 not found: ID does not exist" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779436 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"} err="failed to get container status \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": rpc error: code = NotFound desc = could not find container \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": container with ID starting with 17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25 not found: ID does not exist" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779465 4722 scope.go:117] "RemoveContainer" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" Feb 26 20:17:15 crc kubenswrapper[4722]: E0226 20:17:15.779802 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": container with ID starting with 5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057 not found: ID does not exist" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779826 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"} err="failed 
to get container status \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": rpc error: code = NotFound desc = could not find container \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": container with ID starting with 5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057 not found: ID does not exist" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.795544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815885 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815933 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815946 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815959 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815982 4722 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815995 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.062414 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.079457 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.091613 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:16 crc kubenswrapper[4722]: E0226 20:17:16.092064 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092085 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" Feb 26 20:17:16 crc kubenswrapper[4722]: E0226 20:17:16.092118 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092124 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092338 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092360 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" 
containerName="nova-api-api" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.093465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097329 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097799 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.121944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.163187 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" path="/var/lib/kubelet/pods/63dff4e6-3f4e-4962-bcd3-99144a5948cc/volumes" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.223258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.223340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.225961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.328906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.330012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.333163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.333162 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.334592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.334638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.364161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.456956 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:17:16 crc kubenswrapper[4722]: W0226 20:17:16.889063 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ddeffe_fdc8_4671_9197_da3818ccdfb1.slice/crio-2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde WatchSource:0}: Error finding container 2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde: Status 404 returned error can't find the container with id 2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.890529 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.043044 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749557 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"f580205512bcb409edc88ad859e29b3e7fcf63cb6bf47635d23d7dbcefa95593"} Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"d047241c71835483074fa29b72375efcb0bd62937eb20a80f1eeac122fc93dd8"} Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde"} Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.768229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7682064149999999 
podStartE2EDuration="1.768206415s" podCreationTimestamp="2026-02-26 20:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:17.763652632 +0000 UTC m=+1380.300620576" watchObservedRunningTime="2026-02-26 20:17:17.768206415 +0000 UTC m=+1380.305174349" Feb 26 20:17:18 crc kubenswrapper[4722]: I0226 20:17:18.227496 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:18 crc kubenswrapper[4722]: I0226 20:17:18.227496 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:19 crc kubenswrapper[4722]: I0226 20:17:19.462540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:17:19 crc kubenswrapper[4722]: I0226 20:17:19.462806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.043863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.103959 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.850386 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487031 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487088 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487131 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487919 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487976 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f" gracePeriod=600 Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.830753 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" 
containerID="c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f" exitCode=0 Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.830848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"} Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.831298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"} Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.831333 4722 scope.go:117] "RemoveContainer" containerID="0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed" Feb 26 20:17:24 crc kubenswrapper[4722]: I0226 20:17:24.462604 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:17:24 crc kubenswrapper[4722]: I0226 20:17:24.462933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:17:25 crc kubenswrapper[4722]: I0226 20:17:25.478323 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d6d9cc-9697-46cc-ab38-7879ef449ab3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:25 crc kubenswrapper[4722]: I0226 20:17:25.478323 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d6d9cc-9697-46cc-ab38-7879ef449ab3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:26 crc kubenswrapper[4722]: I0226 20:17:26.457616 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:26 crc kubenswrapper[4722]: I0226 20:17:26.458037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:27 crc kubenswrapper[4722]: I0226 20:17:27.471405 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9ddeffe-fdc8-4671-9197-da3818ccdfb1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:27 crc kubenswrapper[4722]: I0226 20:17:27.472227 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9ddeffe-fdc8-4671-9197-da3818ccdfb1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 20:17:28 crc kubenswrapper[4722]: I0226 20:17:28.847925 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.467996 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.468658 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.474654 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.474788 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 
20:17:36.530488 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.531478 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.531958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.561636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.986586 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.991265 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.420285 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.423686 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.450702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.657608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.657813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.677258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.754823 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:45 crc kubenswrapper[4722]: I0226 20:17:45.245296 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:17:45 crc kubenswrapper[4722]: W0226 20:17:45.253639 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e86a70_aac2_4233_bd15_0dd2a1e17d21.slice/crio-960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5 WatchSource:0}: Error finding container 960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5: Status 404 returned error can't find the container with id 960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5 Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099417 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef" exitCode=0 Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099479 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"} Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5"} Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.460936 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.471736 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] 
Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.567486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.568978 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.571232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.577922 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697610 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.805428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.805579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.806090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.813798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " 
pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.817504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.895297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:47 crc kubenswrapper[4722]: W0226 20:17:47.392554 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf27e7d78_b723_43b0_8734_8892bd8cfd3b.slice/crio-b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235 WatchSource:0}: Error finding container b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235: Status 404 returned error can't find the container with id b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235 Feb 26 20:17:47 crc kubenswrapper[4722]: I0226 20:17:47.395747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.044204 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.135469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.138993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" 
event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerStarted","Data":"68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.139030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerStarted","Data":"b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.167658 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" path="/var/lib/kubelet/pods/04f47952-580e-40b8-80f0-25d1bf8ccc22/volumes" Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.188695 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-vfgst" podStartSLOduration=2.02131426 podStartE2EDuration="2.188675135s" podCreationTimestamp="2026-02-26 20:17:46 +0000 UTC" firstStartedPulling="2026-02-26 20:17:47.394426727 +0000 UTC m=+1409.931394661" lastFinishedPulling="2026-02-26 20:17:47.561787612 +0000 UTC m=+1410.098755536" observedRunningTime="2026-02-26 20:17:48.177424292 +0000 UTC m=+1410.714392226" watchObservedRunningTime="2026-02-26 20:17:48.188675135 +0000 UTC m=+1410.725643059" Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457358 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457660 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" containerID="cri-o://24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457727 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" containerID="cri-o://f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457770 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" containerID="cri-o://5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457745 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" containerID="cri-o://ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.933639 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153022 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" exitCode=0 Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153050 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" exitCode=2 Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f"} Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7"} Feb 26 20:17:50 crc kubenswrapper[4722]: I0226 20:17:50.163739 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" exitCode=0 Feb 26 20:17:50 crc kubenswrapper[4722]: I0226 20:17:50.164085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8"} Feb 26 20:17:51 crc kubenswrapper[4722]: I0226 20:17:51.208512 4722 generic.go:334] "Generic (PLEG): container finished" podID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerID="68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21" exitCode=0 Feb 26 20:17:51 crc kubenswrapper[4722]: I0226 20:17:51.208868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerDied","Data":"68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21"} Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.766983 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq" containerID="cri-o://2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2" gracePeriod=604796 Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.783674 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944619 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944792 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.950095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts" (OuterVolumeSpecName: "scripts") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.963458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf" (OuterVolumeSpecName: "kube-api-access-wxlmf") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "kube-api-access-wxlmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.965189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs" (OuterVolumeSpecName: "certs") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.984100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.993303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data" (OuterVolumeSpecName: "config-data") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047067 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047100 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047112 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047122 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047131 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerDied","Data":"b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235"} Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232723 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232525 4722 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.234491 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020" exitCode=0 Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.234536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.280500 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq" containerID="cri-o://df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d" gracePeriod=604796 Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.303017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.311926 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.415876 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:53 crc kubenswrapper[4722]: E0226 20:17:53.416513 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.416585 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.416867 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.417714 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.420102 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.428826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559956 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661703 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.669576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.669759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.675314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.680060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" 
(UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.691610 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.736013 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.158110 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e702637a-959c-4660-b2a0-dc4325119819" path="/var/lib/kubelet/pods/e702637a-959c-4660-b2a0-dc4325119819/volumes" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.245746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.247901 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" exitCode=0 Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.247976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0"} Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.251037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"} Feb 26 20:17:54 crc 
kubenswrapper[4722]: W0226 20:17:54.257833 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba97d95_3c78_4be9_93d6_3654f3ad8cd6.slice/crio-b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea WatchSource:0}: Error finding container b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea: Status 404 returned error can't find the container with id b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.289332 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrhct" podStartSLOduration=2.766738792 podStartE2EDuration="10.289291596s" podCreationTimestamp="2026-02-26 20:17:44 +0000 UTC" firstStartedPulling="2026-02-26 20:17:46.102305399 +0000 UTC m=+1408.639273353" lastFinishedPulling="2026-02-26 20:17:53.624858233 +0000 UTC m=+1416.161826157" observedRunningTime="2026-02-26 20:17:54.275461984 +0000 UTC m=+1416.812429918" watchObservedRunningTime="2026-02-26 20:17:54.289291596 +0000 UTC m=+1416.826259520" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.446655 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.601014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.601053 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.607819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts" (OuterVolumeSpecName: "scripts") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.608384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.611357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.634245 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb" (OuterVolumeSpecName: "kube-api-access-pwnrb") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "kube-api-access-pwnrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.634374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.659244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703325 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703354 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703365 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703374 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703383 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703391 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.717715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: 
"97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.755640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.755702 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.760896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data" (OuterVolumeSpecName: "config-data") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.807601 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.807911 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.267696 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerStarted","Data":"03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.267746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" 
event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerStarted","Data":"b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.275650 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.279509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.279587 4722 scope.go:117] "RemoveContainer" containerID="ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.287248 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-g6wlr" podStartSLOduration=2.287232456 podStartE2EDuration="2.287232456s" podCreationTimestamp="2026-02-26 20:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:55.281191814 +0000 UTC m=+1417.818159738" watchObservedRunningTime="2026-02-26 20:17:55.287232456 +0000 UTC m=+1417.824200380" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.312504 4722 scope.go:117] "RemoveContainer" containerID="f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.312683 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.321290 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.346830 4722 scope.go:117] "RemoveContainer" 
containerID="5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.352990 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355406 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355438 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355459 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355466 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355478 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355485 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355529 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355537 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355903 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355917 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355934 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.357913 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.361965 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.362173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.362321 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.387768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.399958 4722 scope.go:117] "RemoveContainer" containerID="24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.521546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.521983 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 
20:17:55.522333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522391 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624820 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.626046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 
20:17:55.626251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.631612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.631935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.632130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.632251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.634128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " 
pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.643355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.685567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.817738 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=< Feb 26 20:17:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:17:55 crc kubenswrapper[4722]: > Feb 26 20:17:56 crc kubenswrapper[4722]: I0226 20:17:56.157400 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" path="/var/lib/kubelet/pods/97329a8f-4016-43a9-8589-ee3c1b05aacb/volumes" Feb 26 20:17:56 crc kubenswrapper[4722]: W0226 20:17:56.314436 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda07fb793_d2c8_4d0a_b04e_b6e4476f370c.slice/crio-9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0 WatchSource:0}: Error finding container 9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0: Status 404 returned error can't find the container with id 9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0 Feb 26 20:17:56 crc kubenswrapper[4722]: I0226 20:17:56.328405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.299113 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerID="03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d" exitCode=0 Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.299355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerDied","Data":"03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d"} Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.302321 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0"} Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.332652 4722 generic.go:334] "Generic (PLEG): container finished" podID="a913d767-5243-448d-b5e9-6112a27b6233" containerID="2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2" exitCode=0 Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.332739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2"} Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.448352 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634187 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.645592 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs" (OuterVolumeSpecName: "certs") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.645809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85" (OuterVolumeSpecName: "kube-api-access-bmt85") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "kube-api-access-bmt85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.647631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts" (OuterVolumeSpecName: "scripts") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.666280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data" (OuterVolumeSpecName: "config-data") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.678662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736508 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736570 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736584 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736592 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736600 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.131659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:18:00 crc kubenswrapper[4722]: E0226 20:18:00.132588 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.132686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 
20:18:00.132972 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.133814 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.136754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.137013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.137829 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.184382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.249873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.351435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354814 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerID="df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d" exitCode=0 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354926 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.360556 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.361494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerDied","Data":"b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.361519 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.366275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"63b983d6c6ea0eedeeb24dc7559a1d55d9a8cd77f476c6c0652fbc23d9723478"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.370182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.438456 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.445078 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.452841 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560864 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561301 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561433 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 
20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.566747 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.574259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.576198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.582451 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.582931 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.584519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.586478 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.588985 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.590850 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.590946 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592027 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592044 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592052 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592076 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592087 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592098 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592110 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.659806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info" (OuterVolumeSpecName: "pod-info") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.668314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl" (OuterVolumeSpecName: "kube-api-access-llfpl") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "kube-api-access-llfpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.670706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.674179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info" (OuterVolumeSpecName: "pod-info") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.680724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm" (OuterVolumeSpecName: "kube-api-access-h9xwm") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "kube-api-access-h9xwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.683347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.684797 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717417 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717445 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717456 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717467 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717475 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717483 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717491 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: 
I0226 20:18:00.735118 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.735366 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc" containerID="cri-o://f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.768444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data" (OuterVolumeSpecName: "config-data") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.802963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf" (OuterVolumeSpecName: "server-conf") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.823240 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.823743 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841067 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841315 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log" containerID="cri-o://068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841452 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api" containerID="cri-o://a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.888064 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data" (OuterVolumeSpecName: "config-data") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.928498 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.932823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.046898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf" (OuterVolumeSpecName: "server-conf") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.054083 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.054115 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.165653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.175398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45" (OuterVolumeSpecName: "persistence") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.183216 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f" (OuterVolumeSpecName: "persistence") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.221651 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c8d648_e7a4_40c9_8db8_a8f5e4007d31.slice/crio-068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273118 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273179 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") on node \"crc\" " Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273199 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") on node \"crc\" " Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.308929 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.309554 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45") on node "crc" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.309868 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.310439 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f") on node "crc" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.380781 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.380820 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.406258 4722 generic.go:334] "Generic (PLEG): container finished" podID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" exitCode=143 Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.406546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"} Feb 26 20:18:01 crc 
kubenswrapper[4722]: I0226 20:18:01.417085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3"} Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.417152 4722 scope.go:117] "RemoveContainer" containerID="2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.417377 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.456752 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.457875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"b538376b1060b7bb31c7a7fb9f043cc431a3deb9daf08013c8709a114d5ffb41"} Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.468751 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.531268 4722 scope.go:117] "RemoveContainer" containerID="43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.553085 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.566273 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.574540 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575004 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="setup-container" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575016 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="setup-container" Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575027 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575033 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575045 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575051 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575063 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="setup-container" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575068 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="setup-container" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575270 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575286 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.576470 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.580784 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.580983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dspkw" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581172 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581519 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581636 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.582248 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " 
pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590486 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " 
pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590760 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.598206 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.625648 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.646004 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.648017 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.652002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.652265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.656422 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.656860 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mrr5c" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.664743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.676746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693932 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.695044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.695159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693179 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699710 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699745 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd14e19774d71f109a19171e3fc1d26ffc39fb374e187e66a1dc69515e8b6e48/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.700180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.702081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.710252 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.720001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" 
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.721407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.724595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.734440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801729 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801849 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801927 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801987 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.802092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.818411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 
20:18:01.885711 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.888983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.897103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.903848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.903972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904203 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904228 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904252 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.905933 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.907630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.908689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915472 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.918988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.930724 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.930765 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcc038ee7f96188050e1013bbe01ce8f5883fc8f59481375757326e8cc4a362e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.939642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.954177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.001761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006816 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.080320 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " 
pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.111651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.114341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.115710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.117884 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.119085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.119608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.129972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.167084 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" path="/var/lib/kubelet/pods/3b02241f-513e-4558-b519-5bd84e5b4eff/volumes" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.169064 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a913d767-5243-448d-b5e9-6112a27b6233" path="/var/lib/kubelet/pods/a913d767-5243-448d-b5e9-6112a27b6233/volumes" Feb 26 20:18:02 
crc kubenswrapper[4722]: I0226 20:18:02.243683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.487443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerStarted","Data":"488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.492682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"738836bb8256813c7e2123bd178e44d3d34d30760368ff177dee04e65329f23f"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.497697 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0705108-f020-43bc-a1af-7edae5a50927" containerID="f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad" exitCode=0 Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.497749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerDied","Data":"f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.555332 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:18:02 crc kubenswrapper[4722]: W0226 20:18:02.617270 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod796c5930_3ba4_4795_88f0_2e85145f3c85.slice/crio-39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7 WatchSource:0}: Error finding container 39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7: Status 404 returned error can't find the container with id 39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7 Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.634324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636463 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.640701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs" (OuterVolumeSpecName: "certs") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.642884 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.652556 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.658323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8" (OuterVolumeSpecName: "kube-api-access-xn9z8") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "kube-api-access-xn9z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.659689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts" (OuterVolumeSpecName: "scripts") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.678010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data" (OuterVolumeSpecName: "config-data") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.729943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.745691 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746567 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746681 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746740 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746797 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.759839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.888422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:02 crc kubenswrapper[4722]: W0226 20:18:02.937087 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fb7fb48_09f9_4e86_9d51_a56d0d2cebda.slice/crio-f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661 WatchSource:0}: Error finding container f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661: Status 404 returned error can't find the container with id f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661 Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.269295 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361335 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361372 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361398 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361600 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " 
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361669 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361715 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361749 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361780 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.364195 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs" (OuterVolumeSpecName: "logs") pod 
"52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.371567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374295 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs" (OuterVolumeSpecName: "certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf" (OuterVolumeSpecName: "kube-api-access-pstwf") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "kube-api-access-pstwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374365 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts" (OuterVolumeSpecName: "scripts") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.406378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data" (OuterVolumeSpecName: "config-data") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.425296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.453686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.465275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466509 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466542 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466552 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466562 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466570 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466578 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466587 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466598 4722 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466605 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508681 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerDied","Data":"d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508749 4722 scope.go:117] "RemoveContainer" containerID="f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508862 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.514586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517396 4722 generic.go:334] "Generic (PLEG): container finished" podID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" exitCode=0 Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"fc5320da3d9a270e99a8cf10b9849b44fb32d59bacc00c14b98e2cdd4eb56b17"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517509 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.523352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"c7f181bdf1a658e8adf48f37d74fb12fc2345a8ca4834825a8be1762cec08478"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525151 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f" exitCode=0 Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerStarted","Data":"f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.535562 4722 generic.go:334] "Generic (PLEG): container finished" podID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerID="7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63" exitCode=0 Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.535601 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerDied","Data":"7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63"} Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.913498 4722 scope.go:117] "RemoveContainer" 
containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.937418 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.953840 4722 scope.go:117] "RemoveContainer" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.964176 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.976070 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.982212 4722 scope.go:117] "RemoveContainer" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" Feb 26 20:18:03 crc kubenswrapper[4722]: E0226 20:18:03.983254 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": container with ID starting with a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644 not found: ID does not exist" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.983298 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"} err="failed to get container status \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": rpc error: code = NotFound desc = could not find container \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": container with ID starting with a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644 not found: ID does not exist" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 
20:18:03.983324 4722 scope.go:117] "RemoveContainer" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" Feb 26 20:18:03 crc kubenswrapper[4722]: E0226 20:18:03.983558 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": container with ID starting with 068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e not found: ID does not exist" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.983580 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"} err="failed to get container status \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": rpc error: code = NotFound desc = could not find container \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": container with ID starting with 068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e not found: ID does not exist" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004273 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004801 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004822 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api" Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004838 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 
20:18:04.004845 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc" Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004880 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004887 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005095 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005157 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005177 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005910 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013592 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013699 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.022960 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.032869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.050227 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.051948 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.054631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.054938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.055796 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.071705 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.177092 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" path="/var/lib/kubelet/pods/52c8d648-e7a4-40c9-8db8-a8f5e4007d31/volumes" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.177737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0705108-f020-43bc-a1af-7edae5a50927" path="/var/lib/kubelet/pods/e0705108-f020-43bc-a1af-7edae5a50927/volumes" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.179111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc 
kubenswrapper[4722]: I0226 20:18:04.180571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180595 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180897 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181147 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod 
\"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284037 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: 
\"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.290123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293334 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc 
kubenswrapper[4722]: I0226 20:18:04.295112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.295824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.296410 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.302666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.303671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.406890 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.420989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.567795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"b9cd59cbb5f059580dd1a5085673c98b686d228e3aef5315d7aeb75a57d2120d"} Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.568814 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.573612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a"} Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.576583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7"} Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.580204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerStarted","Data":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"} Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.599865 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9045827480000002 podStartE2EDuration="9.599846519s" podCreationTimestamp="2026-02-26 20:17:55 +0000 UTC" firstStartedPulling="2026-02-26 20:17:56.317194359 +0000 UTC m=+1418.854162283" 
lastFinishedPulling="2026-02-26 20:18:04.01245813 +0000 UTC m=+1426.549426054" observedRunningTime="2026-02-26 20:18:04.593040676 +0000 UTC m=+1427.130008620" watchObservedRunningTime="2026-02-26 20:18:04.599846519 +0000 UTC m=+1427.136814453" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.656081 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-595979776c-nrnx7" podStartSLOduration=3.656062092 podStartE2EDuration="3.656062092s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:04.63555029 +0000 UTC m=+1427.172518224" watchObservedRunningTime="2026-02-26 20:18:04.656062092 +0000 UTC m=+1427.193030016" Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.999550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.131161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:05 crc kubenswrapper[4722]: W0226 20:18:05.137090 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb8d392_1263_4049_bb26_f832cc4526e1.slice/crio-cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c WatchSource:0}: Error finding container cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c: Status 404 returned error can't find the container with id cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.140389 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.203884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"12e9c803-fc70-41f2-83a2-23e6917fa381\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.213644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs" (OuterVolumeSpecName: "kube-api-access-d9qvs") pod "12e9c803-fc70-41f2-83a2-23e6917fa381" (UID: "12e9c803-fc70-41f2-83a2-23e6917fa381"). InnerVolumeSpecName "kube-api-access-d9qvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.307360 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.591264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"73cc9447-4501-43ec-9f4a-2e406341ee16","Type":"ContainerStarted","Data":"cabcad7109afe9d91ad1a0ebaaf4293260f4110d5b89106ec1975292413eddc9"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.591326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"73cc9447-4501-43ec-9f4a-2e406341ee16","Type":"ContainerStarted","Data":"da0977003eeccd2a53d57cefbf219878f2423e6bdf84cd6ee6dca59416ed2be4"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"d2a764f61de7a1ac7630fdeca3dd0fad350af0788392e4ed50acabc3b99fc632"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"4b117393bbf26385508a23276b09af32eeb2ef7ad7f1f6cf7928a84537a5790d"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.596067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.598822 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599339 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerDied","Data":"488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1"} Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599411 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599628 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.616011 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.390569642 podStartE2EDuration="2.615992439s" podCreationTimestamp="2026-02-26 20:18:03 +0000 UTC" firstStartedPulling="2026-02-26 20:18:05.017739207 +0000 UTC m=+1427.554707131" lastFinishedPulling="2026-02-26 20:18:05.243162004 +0000 UTC m=+1427.780129928" observedRunningTime="2026-02-26 20:18:05.607550002 +0000 UTC m=+1428.144517936" watchObservedRunningTime="2026-02-26 20:18:05.615992439 +0000 UTC m=+1428.152960363" Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.649625 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.649605204 podStartE2EDuration="2.649605204s" podCreationTimestamp="2026-02-26 20:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:05.635446373 +0000 UTC m=+1428.172414297" watchObservedRunningTime="2026-02-26 20:18:05.649605204 +0000 UTC m=+1428.186573128" Feb 26 20:18:05 crc 
kubenswrapper[4722]: I0226 20:18:05.812518 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=< Feb 26 20:18:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:18:05 crc kubenswrapper[4722]: > Feb 26 20:18:06 crc kubenswrapper[4722]: I0226 20:18:06.203090 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"] Feb 26 20:18:06 crc kubenswrapper[4722]: I0226 20:18:06.212855 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"] Feb 26 20:18:08 crc kubenswrapper[4722]: I0226 20:18:08.167759 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" path="/var/lib/kubelet/pods/310eccc9-804e-4a2c-ba45-adf425f191ba/volumes" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.250321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.319438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.319856 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78468d7767-275dc" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" containerID="cri-o://91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" gracePeriod=10 Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.455452 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:12 crc kubenswrapper[4722]: E0226 20:18:12.455906 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.455922 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.456156 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.457340 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.482106 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.563655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.563711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " 
pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: 
\"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " 
pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671699 4722 generic.go:334] "Generic (PLEG): container finished" podID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerID="91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" exitCode=0 Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981"} Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.672454 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod 
\"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.673084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.673152 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.674120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.692306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.827558 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.004793 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.081971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082075 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.109124 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2" (OuterVolumeSpecName: "kube-api-access-btnm2") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "kube-api-access-btnm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.159818 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.165310 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.170630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.181440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185095 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185127 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185151 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185159 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185169 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.244412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config" (OuterVolumeSpecName: "config") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.287669 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.318510 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.645837 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:13 crc kubenswrapper[4722]: E0226 20:18:13.646366 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="init" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646386 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="init" Feb 26 20:18:13 crc kubenswrapper[4722]: E0226 20:18:13.646406 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646412 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646652 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.648396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.673818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.701738 4722 generic.go:334] "Generic (PLEG): container finished" podID="3065620c-5bba-4e4f-a622-151e564a3e06" containerID="5f2015438acb524523e3857e08cd5e956b7a6a5fdb467fbabdc4997f26b05bb3" exitCode=0 Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.702001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerDied","Data":"5f2015438acb524523e3857e08cd5e956b7a6a5fdb467fbabdc4997f26b05bb3"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.702092 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerStarted","Data":"97fd3634b6ece4e5fd0458be77ae54f73682fda825d7f091a9f266ae9facc299"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.707975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"5aa06895449e8178118801bc34ee6a228ece2474fb523cfc5dcb8d816767e6f8"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.708035 4722 scope.go:117] "RemoveContainer" containerID="91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.708129 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799184 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.896318 4722 scope.go:117] "RemoveContainer" containerID="70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.900903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.900986 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901009 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.969602 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.970866 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " 
pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.984775 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.157210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" path="/var/lib/kubelet/pods/3daa70c7-4339-4dad-8531-4e9772dca52d/volumes" Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.184948 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.622670 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:14 crc kubenswrapper[4722]: W0226 20:18:14.641504 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699555aa_918c_47bd_a64f_e228eceeeb78.slice/crio-f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32 WatchSource:0}: Error finding container f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32: Status 404 returned error can't find the container with id f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32 Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.731346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerStarted","Data":"f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32"} Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.742586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" 
event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerStarted","Data":"93b967e8688874106d8567624b9adef50c4660d3124dc29a192e1cbfd1ca591c"} Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.742742 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.762499 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" podStartSLOduration=2.762479011 podStartE2EDuration="2.762479011s" podCreationTimestamp="2026-02-26 20:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:14.75908738 +0000 UTC m=+1437.296055324" watchObservedRunningTime="2026-02-26 20:18:14.762479011 +0000 UTC m=+1437.299446945" Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.753963 4722 generic.go:334] "Generic (PLEG): container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526" exitCode=0 Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.754022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"} Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.820598 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=< Feb 26 20:18:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:18:15 crc kubenswrapper[4722]: > Feb 26 20:18:16 crc kubenswrapper[4722]: I0226 20:18:16.767995 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c" exitCode=0 Feb 26 20:18:16 crc kubenswrapper[4722]: I0226 20:18:16.768067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"} Feb 26 20:18:17 crc kubenswrapper[4722]: I0226 20:18:17.780789 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerStarted","Data":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"} Feb 26 20:18:17 crc kubenswrapper[4722]: I0226 20:18:17.796898 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sh4jh" podStartSLOduration=3.40232601 podStartE2EDuration="4.796880164s" podCreationTimestamp="2026-02-26 20:18:13 +0000 UTC" firstStartedPulling="2026-02-26 20:18:15.756074995 +0000 UTC m=+1438.293042919" lastFinishedPulling="2026-02-26 20:18:17.150629139 +0000 UTC m=+1439.687597073" observedRunningTime="2026-02-26 20:18:17.795513197 +0000 UTC m=+1440.332481141" watchObservedRunningTime="2026-02-26 20:18:17.796880164 +0000 UTC m=+1440.333848108" Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.829181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.909530 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.909809 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-595979776c-nrnx7" 
podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" containerID="cri-o://9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" gracePeriod=10 Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.403403 4722 scope.go:117] "RemoveContainer" containerID="7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.520729 4722 scope.go:117] "RemoveContainer" containerID="4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.548668 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.708648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.708987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709012 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.718037 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9" (OuterVolumeSpecName: "kube-api-access-2jfm9") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "kube-api-access-2jfm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.767343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.768298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.769533 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.777791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config" (OuterVolumeSpecName: "config") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.778296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.779633 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812621 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812794 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812853 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812905 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 
20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812982 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.813043 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.813097 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848452 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" exitCode=0 Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"} Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661"} Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848545 4722 scope.go:117] "RemoveContainer" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848671 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.876165 4722 scope.go:117] "RemoveContainer" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.886871 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.896886 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.904968 4722 scope.go:117] "RemoveContainer" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" Feb 26 20:18:23 crc kubenswrapper[4722]: E0226 20:18:23.905380 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": container with ID starting with 9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d not found: ID does not exist" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905420 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"} err="failed to get container status \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": rpc error: code = NotFound desc = could not find container \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": container with ID starting with 9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d not found: ID does not exist" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905448 4722 scope.go:117] "RemoveContainer" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f" Feb 26 
20:18:23 crc kubenswrapper[4722]: E0226 20:18:23.905822 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": container with ID starting with fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f not found: ID does not exist" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f" Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905862 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"} err="failed to get container status \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": rpc error: code = NotFound desc = could not find container \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": container with ID starting with fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f not found: ID does not exist" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.156909 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" path="/var/lib/kubelet/pods/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda/volumes" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.185369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.185421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.238131 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.818991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.870010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.921631 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.703250 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.873251 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.874158 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" containerID="cri-o://b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" gracePeriod=2 Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.446829 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566583 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.567697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities" (OuterVolumeSpecName: "utilities") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.574444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2" (OuterVolumeSpecName: "kube-api-access-kjjl2") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "kube-api-access-kjjl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.670161 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.670465 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.709739 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.772795 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895048 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" exitCode=0 Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"} Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5"} Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895189 4722 scope.go:117] "RemoveContainer" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895205 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.958516 4722 scope.go:117] "RemoveContainer" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020" Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.970201 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.985388 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.993449 4722 scope.go:117] "RemoveContainer" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.036293 4722 scope.go:117] "RemoveContainer" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 20:18:27.037029 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": container with ID starting with b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3 not found: ID does not exist" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037114 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"} err="failed to get container status \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": rpc error: code = NotFound desc = could not find container \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": container with ID starting with b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3 not found: ID does not exist" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037190 4722 scope.go:117] "RemoveContainer" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020" Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 20:18:27.037718 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": container with ID starting with 407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020 not found: ID does not exist" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037781 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} err="failed to get container status \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": rpc error: code = NotFound desc = could not find container \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": container with ID starting with 407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020 not found: ID does not exist" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037829 4722 scope.go:117] "RemoveContainer" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef" Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 
20:18:27.038216 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": container with ID starting with eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef not found: ID does not exist" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.038270 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"} err="failed to get container status \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": rpc error: code = NotFound desc = could not find container \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": container with ID starting with eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef not found: ID does not exist" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.276655 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.276925 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sh4jh" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" containerID="cri-o://eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" gracePeriod=2 Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.817743 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898565 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.899479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities" (OuterVolumeSpecName: "utilities") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.899721 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.905481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888" (OuterVolumeSpecName: "kube-api-access-bl888") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "kube-api-access-bl888". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910652 4722 generic.go:334] "Generic (PLEG): container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" exitCode=0 Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"} Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32"} Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910769 4722 scope.go:117] "RemoveContainer" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910892 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.924164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.963255 4722 scope.go:117] "RemoveContainer" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c" Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.994254 4722 scope.go:117] "RemoveContainer" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.001422 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.001449 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.018495 4722 scope.go:117] "RemoveContainer" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.018962 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": container with ID starting with eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0 not found: ID does not exist" 
containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019018 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"} err="failed to get container status \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": rpc error: code = NotFound desc = could not find container \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": container with ID starting with eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0 not found: ID does not exist" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019054 4722 scope.go:117] "RemoveContainer" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c" Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.019661 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": container with ID starting with 3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c not found: ID does not exist" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019693 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"} err="failed to get container status \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": rpc error: code = NotFound desc = could not find container \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": container with ID starting with 3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c not found: ID does not exist" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019715 4722 scope.go:117] 
"RemoveContainer" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526" Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.019996 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": container with ID starting with 3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526 not found: ID does not exist" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.020021 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"} err="failed to get container status \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": rpc error: code = NotFound desc = could not find container \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": container with ID starting with 3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526 not found: ID does not exist" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.160323 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" path="/var/lib/kubelet/pods/a6e86a70-aac2-4233-bd15-0dd2a1e17d21/volumes" Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.240857 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.253670 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:30 crc kubenswrapper[4722]: I0226 20:18:30.165907 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" path="/var/lib/kubelet/pods/699555aa-918c-47bd-a64f-e228eceeeb78/volumes" 
Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.833222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834383 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834392 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834410 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834417 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834436 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc 
kubenswrapper[4722]: I0226 20:18:35.834443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834470 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="init" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834485 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="init" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834493 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834500 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834731 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834755 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834772 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 
20:18:35.835721 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.840876 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.841108 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.840988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.841054 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.860733 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 
26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876973 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.877116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.979050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.984442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.991865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.992111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: 
\"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.996106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:36 crc kubenswrapper[4722]: I0226 20:18:36.166460 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:36 crc kubenswrapper[4722]: I0226 20:18:36.782643 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.032067 4722 generic.go:334] "Generic (PLEG): container finished" podID="796c5930-3ba4-4795-88f0-2e85145f3c85" containerID="d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a" exitCode=0 Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.032165 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerDied","Data":"d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a"} Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.034172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerStarted","Data":"1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7"} Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.036596 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="e3bb51c2-ceca-4301-82cb-959028030d58" containerID="b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7" exitCode=0 Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.036657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerDied","Data":"b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.054722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"148ce3e4c870073cff67b48477327b2618e85948146a37c2500ffd2d94956c14"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.055612 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.057066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"581d5ff09912fcd536b697bc47a92c4f0a0ccfdab138577ddb7d2be7ef4f9c76"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.057436 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.101405 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.10138811 podStartE2EDuration="37.10138811s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:38.088489233 +0000 UTC m=+1460.625457157" watchObservedRunningTime="2026-02-26 20:18:38.10138811 +0000 UTC m=+1460.638356034" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 
20:18:38.125253 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.125233552 podStartE2EDuration="37.125233552s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:38.116865647 +0000 UTC m=+1460.653833571" watchObservedRunningTime="2026-02-26 20:18:38.125233552 +0000 UTC m=+1460.662201476" Feb 26 20:18:41 crc kubenswrapper[4722]: I0226 20:18:41.543808 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 26 20:18:47 crc kubenswrapper[4722]: I0226 20:18:47.165782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerStarted","Data":"da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8"} Feb 26 20:18:47 crc kubenswrapper[4722]: I0226 20:18:47.187597 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" podStartSLOduration=3.02633452 podStartE2EDuration="12.187562839s" podCreationTimestamp="2026-02-26 20:18:35 +0000 UTC" firstStartedPulling="2026-02-26 20:18:36.783862138 +0000 UTC m=+1459.320830062" lastFinishedPulling="2026-02-26 20:18:45.945090447 +0000 UTC m=+1468.482058381" observedRunningTime="2026-02-26 20:18:47.186100479 +0000 UTC m=+1469.723068413" watchObservedRunningTime="2026-02-26 20:18:47.187562839 +0000 UTC m=+1469.724530843" Feb 26 20:18:51 crc kubenswrapper[4722]: I0226 20:18:51.963300 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 20:18:52 crc kubenswrapper[4722]: I0226 20:18:52.092243 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:57 crc kubenswrapper[4722]: I0226 20:18:57.278233 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerID="da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8" exitCode=0 Feb 26 20:18:57 crc kubenswrapper[4722]: I0226 20:18:57.278346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerDied","Data":"da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8"} Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.831408 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.942680 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.942937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.943036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.943106 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.950984 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws" (OuterVolumeSpecName: "kube-api-access-d9bws") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "kube-api-access-d9bws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.951448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.976067 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.979514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory" (OuterVolumeSpecName: "inventory") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.045834 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046120 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046207 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046276 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.299975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerDied","Data":"1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7"} Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.300427 
4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.300062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.394118 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:18:59 crc kubenswrapper[4722]: E0226 20:18:59.394730 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.394754 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.395002 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.396762 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400101 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400264 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400340 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400486 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.407061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455211 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455373 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.578304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.579720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.582971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.727241 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:19:00 crc kubenswrapper[4722]: W0226 20:19:00.335906 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0a077a_aebd_490b_b110_bc7927910d4a.slice/crio-10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8 WatchSource:0}: Error finding container 10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8: Status 404 returned error can't find the container with id 10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8 Feb 26 20:19:00 crc kubenswrapper[4722]: I0226 20:19:00.342430 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.328055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerStarted","Data":"812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99"} Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.328995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerStarted","Data":"10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8"} Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.357384 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" podStartSLOduration=1.8735947849999999 podStartE2EDuration="2.357361406s" podCreationTimestamp="2026-02-26 20:18:59 +0000 UTC" firstStartedPulling="2026-02-26 20:19:00.338884613 +0000 UTC m=+1482.875852537" lastFinishedPulling="2026-02-26 20:19:00.822651234 +0000 UTC m=+1483.359619158" 
observedRunningTime="2026-02-26 20:19:01.349423513 +0000 UTC m=+1483.886391537" watchObservedRunningTime="2026-02-26 20:19:01.357361406 +0000 UTC m=+1483.894329350" Feb 26 20:19:04 crc kubenswrapper[4722]: I0226 20:19:04.377749 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerID="812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99" exitCode=0 Feb 26 20:19:04 crc kubenswrapper[4722]: I0226 20:19:04.377877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerDied","Data":"812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99"} Feb 26 20:19:05 crc kubenswrapper[4722]: I0226 20:19:05.940425 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021251 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" 
(UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.026830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj" (OuterVolumeSpecName: "kube-api-access-rbxxj") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "kube-api-access-rbxxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.055658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.062519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory" (OuterVolumeSpecName: "inventory") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124329 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124357 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") on node \"crc\" DevicePath \"\"" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124371 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404147 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerDied","Data":"10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8"} Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404195 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404237 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.477104 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"] Feb 26 20:19:06 crc kubenswrapper[4722]: E0226 20:19:06.477816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.477847 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.478419 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.479576 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.481765 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482775 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.494649 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"] Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: 
I0226 20:19:06.533765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.639569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.639594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.640738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.651413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.813563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:19:07 crc kubenswrapper[4722]: I0226 20:19:07.405287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"] Feb 26 20:19:07 crc kubenswrapper[4722]: W0226 20:19:07.411819 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aea65fe_4b22_44f8_b756_2ee54c916c8a.slice/crio-95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d WatchSource:0}: Error finding container 95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d: Status 404 returned error can't find the container with id 95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.429405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerStarted","Data":"2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50"} Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.429950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" 
event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerStarted","Data":"95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d"} Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.445303 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" podStartSLOduration=1.9555571600000001 podStartE2EDuration="2.445255851s" podCreationTimestamp="2026-02-26 20:19:06 +0000 UTC" firstStartedPulling="2026-02-26 20:19:07.415289638 +0000 UTC m=+1489.952257572" lastFinishedPulling="2026-02-26 20:19:07.904988339 +0000 UTC m=+1490.441956263" observedRunningTime="2026-02-26 20:19:08.44375759 +0000 UTC m=+1490.980725514" watchObservedRunningTime="2026-02-26 20:19:08.445255851 +0000 UTC m=+1490.982223775" Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.487283 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.487877 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.720750 4722 scope.go:117] "RemoveContainer" containerID="fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b" Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.773307 4722 scope.go:117] "RemoveContainer" containerID="84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b" Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.816114 4722 scope.go:117] 
"RemoveContainer" containerID="df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d" Feb 26 20:19:53 crc kubenswrapper[4722]: I0226 20:19:53.486995 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:19:53 crc kubenswrapper[4722]: I0226 20:19:53.487549 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.142021 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"] Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.145175 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149088 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149302 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.160984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"] Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.280095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.381761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.403568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " 
pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.466528 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.965131 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.967300 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"] Feb 26 20:20:01 crc kubenswrapper[4722]: I0226 20:20:01.035292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerStarted","Data":"c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7"} Feb 26 20:20:03 crc kubenswrapper[4722]: I0226 20:20:03.062213 4722 generic.go:334] "Generic (PLEG): container finished" podID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerID="a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d" exitCode=0 Feb 26 20:20:03 crc kubenswrapper[4722]: I0226 20:20:03.062283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerDied","Data":"a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d"} Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.516790 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.684816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"10c709bd-8242-4d15-b343-b6e07c3cb44c\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.693977 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx" (OuterVolumeSpecName: "kube-api-access-npwnx") pod "10c709bd-8242-4d15-b343-b6e07c3cb44c" (UID: "10c709bd-8242-4d15-b343-b6e07c3cb44c"). InnerVolumeSpecName "kube-api-access-npwnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.787685 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") on node \"crc\" DevicePath \"\"" Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerDied","Data":"c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7"} Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096786 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7" Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096815 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.611765 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.624272 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:20:06 crc kubenswrapper[4722]: I0226 20:20:06.166667 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" path="/var/lib/kubelet/pods/a81e036d-5879-4813-bfda-9a203246b1e3/volumes" Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487189 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487680 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487719 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.488241 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.488288 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" gracePeriod=600 Feb 26 20:20:23 crc kubenswrapper[4722]: E0226 20:20:23.609746 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.962974 4722 scope.go:117] "RemoveContainer" containerID="5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04" Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.033791 4722 scope.go:117] "RemoveContainer" containerID="81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925" Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325541 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" exitCode=0 Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"} Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325635 4722 
scope.go:117] "RemoveContainer" containerID="c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f" Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.326592 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:20:24 crc kubenswrapper[4722]: E0226 20:20:24.327023 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:20:38 crc kubenswrapper[4722]: I0226 20:20:38.146321 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:20:38 crc kubenswrapper[4722]: E0226 20:20:38.147031 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.630959 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:20:48 crc kubenswrapper[4722]: E0226 20:20:48.632179 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.632198 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc" Feb 
26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.632428 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.634361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.646550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719153 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821565 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.822115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.822177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.842859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79ngm\" (UniqueName: 
\"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.962115 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:49 crc kubenswrapper[4722]: I0226 20:20:49.424898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:20:49 crc kubenswrapper[4722]: I0226 20:20:49.591413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"4c789c4a7e2ff59a7b9107b416dc9d6defa0e80bd77a26652a24c8518e46ceab"} Feb 26 20:20:50 crc kubenswrapper[4722]: I0226 20:20:50.604168 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" exitCode=0 Feb 26 20:20:50 crc kubenswrapper[4722]: I0226 20:20:50.604287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3"} Feb 26 20:20:52 crc kubenswrapper[4722]: I0226 20:20:52.146570 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:20:52 crc kubenswrapper[4722]: E0226 20:20:52.147071 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:20:52 crc kubenswrapper[4722]: I0226 20:20:52.623577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"} Feb 26 20:20:53 crc kubenswrapper[4722]: I0226 20:20:53.634551 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" exitCode=0 Feb 26 20:20:53 crc kubenswrapper[4722]: I0226 20:20:53.634600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"} Feb 26 20:20:54 crc kubenswrapper[4722]: I0226 20:20:54.645362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"} Feb 26 20:20:54 crc kubenswrapper[4722]: I0226 20:20:54.665987 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gjpnv" podStartSLOduration=3.265986556 podStartE2EDuration="6.665966674s" podCreationTimestamp="2026-02-26 20:20:48 +0000 UTC" firstStartedPulling="2026-02-26 20:20:50.606475709 +0000 UTC m=+1593.143443633" lastFinishedPulling="2026-02-26 20:20:54.006455827 +0000 UTC m=+1596.543423751" observedRunningTime="2026-02-26 20:20:54.663522149 +0000 UTC m=+1597.200490093" 
watchObservedRunningTime="2026-02-26 20:20:54.665966674 +0000 UTC m=+1597.202934598" Feb 26 20:20:58 crc kubenswrapper[4722]: I0226 20:20:58.962873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:58 crc kubenswrapper[4722]: I0226 20:20:58.963554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.045878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.778284 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.843244 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:21:01 crc kubenswrapper[4722]: I0226 20:21:01.726242 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gjpnv" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" containerID="cri-o://7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" gracePeriod=2 Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.308113 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404599 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.405496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities" (OuterVolumeSpecName: "utilities") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.415866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm" (OuterVolumeSpecName: "kube-api-access-79ngm") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "kube-api-access-79ngm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.456625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507441 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507473 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507483 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743408 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" exitCode=0 Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"} Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743527 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"4c789c4a7e2ff59a7b9107b416dc9d6defa0e80bd77a26652a24c8518e46ceab"} Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743549 4722 scope.go:117] "RemoveContainer" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743716 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.796277 4722 scope.go:117] "RemoveContainer" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.807842 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.818612 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.820256 4722 scope.go:117] "RemoveContainer" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.867071 4722 scope.go:117] "RemoveContainer" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: E0226 20:21:02.867773 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": container with ID starting with 7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb not found: ID does not exist" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 
20:21:02.867828 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"} err="failed to get container status \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": rpc error: code = NotFound desc = could not find container \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": container with ID starting with 7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb not found: ID does not exist" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.867879 4722 scope.go:117] "RemoveContainer" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: E0226 20:21:02.869396 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": container with ID starting with 1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36 not found: ID does not exist" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869446 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"} err="failed to get container status \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": rpc error: code = NotFound desc = could not find container \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": container with ID starting with 1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36 not found: ID does not exist" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869465 4722 scope.go:117] "RemoveContainer" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc 
kubenswrapper[4722]: E0226 20:21:02.869935 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": container with ID starting with 114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3 not found: ID does not exist" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3"} err="failed to get container status \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": rpc error: code = NotFound desc = could not find container \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": container with ID starting with 114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3 not found: ID does not exist" Feb 26 20:21:03 crc kubenswrapper[4722]: I0226 20:21:03.147032 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:03 crc kubenswrapper[4722]: E0226 20:21:03.147440 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:04 crc kubenswrapper[4722]: I0226 20:21:04.167764 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b84e9a-bab4-4169-a299-d5133d490692" path="/var/lib/kubelet/pods/45b84e9a-bab4-4169-a299-d5133d490692/volumes" Feb 26 20:21:15 crc 
kubenswrapper[4722]: I0226 20:21:15.146300 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:15 crc kubenswrapper[4722]: E0226 20:21:15.148305 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:24 crc kubenswrapper[4722]: I0226 20:21:24.150205 4722 scope.go:117] "RemoveContainer" containerID="6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed" Feb 26 20:21:24 crc kubenswrapper[4722]: I0226 20:21:24.197481 4722 scope.go:117] "RemoveContainer" containerID="78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d" Feb 26 20:21:29 crc kubenswrapper[4722]: I0226 20:21:29.147432 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:29 crc kubenswrapper[4722]: E0226 20:21:29.148671 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.787039 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788076 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-utilities" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788091 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-utilities" Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788108 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-content" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788114 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-content" Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788183 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788192 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.790112 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.803853 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944200 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944723 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.046876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.047358 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.047413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.048049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.048170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.065730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.109432 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.560103 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210886 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634" exitCode=0 Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"} Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210999 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerStarted","Data":"82acc75d7b84b6c18844311b195ab703c4f8e7088c2b66994ebacf88f0fbb040"} Feb 26 20:21:43 crc kubenswrapper[4722]: I0226 20:21:43.150787 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:43 crc kubenswrapper[4722]: E0226 20:21:43.151532 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:44 crc kubenswrapper[4722]: I0226 20:21:44.250743 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" 
containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370" exitCode=0 Feb 26 20:21:44 crc kubenswrapper[4722]: I0226 20:21:44.250848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"} Feb 26 20:21:45 crc kubenswrapper[4722]: I0226 20:21:45.262781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerStarted","Data":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"} Feb 26 20:21:45 crc kubenswrapper[4722]: I0226 20:21:45.286580 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h92mt" podStartSLOduration=2.8588639860000002 podStartE2EDuration="5.286563196s" podCreationTimestamp="2026-02-26 20:21:40 +0000 UTC" firstStartedPulling="2026-02-26 20:21:42.214430951 +0000 UTC m=+1644.751398885" lastFinishedPulling="2026-02-26 20:21:44.642130171 +0000 UTC m=+1647.179098095" observedRunningTime="2026-02-26 20:21:45.280914655 +0000 UTC m=+1647.817882599" watchObservedRunningTime="2026-02-26 20:21:45.286563196 +0000 UTC m=+1647.823531120" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.110171 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.110743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.156806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 
20:21:51.384182 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.440955 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:53 crc kubenswrapper[4722]: I0226 20:21:53.337940 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h92mt" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server" containerID="cri-o://c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" gracePeriod=2 Feb 26 20:21:53 crc kubenswrapper[4722]: I0226 20:21:53.931264 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.110840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.111273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.111349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: 
I0226 20:21:54.112270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities" (OuterVolumeSpecName: "utilities") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.123302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm" (OuterVolumeSpecName: "kube-api-access-c42cm") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "kube-api-access-c42cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.146767 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.147295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.156879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214523 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214588 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214601 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361035 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" exitCode=0 Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"} Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"82acc75d7b84b6c18844311b195ab703c4f8e7088c2b66994ebacf88f0fbb040"} Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361122 4722 scope.go:117] "RemoveContainer" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 
20:21:54.361128 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.390282 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.400840 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.412888 4722 scope.go:117] "RemoveContainer" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.470979 4722 scope.go:117] "RemoveContainer" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.519348 4722 scope.go:117] "RemoveContainer" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.520649 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": container with ID starting with c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250 not found: ID does not exist" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.520705 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"} err="failed to get container status \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": rpc error: code = NotFound desc = could not find container \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": container with ID starting with 
c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250 not found: ID does not exist" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.520739 4722 scope.go:117] "RemoveContainer" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370" Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.521540 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": container with ID starting with 14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370 not found: ID does not exist" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.521570 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"} err="failed to get container status \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": rpc error: code = NotFound desc = could not find container \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": container with ID starting with 14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370 not found: ID does not exist" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.521590 4722 scope.go:117] "RemoveContainer" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634" Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.521853 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": container with ID starting with ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634 not found: ID does not exist" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634" Feb 26 20:21:54 crc 
kubenswrapper[4722]: I0226 20:21:54.521893 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"} err="failed to get container status \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": rpc error: code = NotFound desc = could not find container \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": container with ID starting with ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634 not found: ID does not exist" Feb 26 20:21:56 crc kubenswrapper[4722]: I0226 20:21:56.162029 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" path="/var/lib/kubelet/pods/b1c54ad1-434c-4c0a-b220-b63c25333dcf/volumes" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.190048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"] Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191458 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-content" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191492 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-content" Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191527 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-utilities" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191536 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-utilities" Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server" 
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191626 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.192181 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.193306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"] Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.193394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.196474 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.197340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.197504 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.392782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.495218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") 
pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.526471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.816374 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.318471 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"] Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.442496 4722 generic.go:334] "Generic (PLEG): container finished" podID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerID="2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50" exitCode=0 Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.442589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerDied","Data":"2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50"} Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.445188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerStarted","Data":"e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2"} Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.012268 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146746 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146796 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.147005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.163503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.163536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9" (OuterVolumeSpecName: "kube-api-access-plkz9") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "kube-api-access-plkz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:22:03 crc kubenswrapper[4722]: E0226 20:22:03.175002 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam podName:7aea65fe-4b22-44f8-b756-2ee54c916c8a nodeName:}" failed. No retries permitted until 2026-02-26 20:22:03.674969753 +0000 UTC m=+1666.211937697 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a") : error deleting /var/lib/kubelet/pods/7aea65fe-4b22-44f8-b756-2ee54c916c8a/volume-subpaths: remove /var/lib/kubelet/pods/7aea65fe-4b22-44f8-b756-2ee54c916c8a/volume-subpaths: no such file or directory Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.178459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory" (OuterVolumeSpecName: "inventory") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.249847 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.250319 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") on node \"crc\" DevicePath \"\"" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.250331 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.463279 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerID="672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493" exitCode=0 Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.463342 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerDied","Data":"672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493"} Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerDied","Data":"95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d"} Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465223 4722 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465239 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.556187 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"] Feb 26 20:22:03 crc kubenswrapper[4722]: E0226 20:22:03.556704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.556729 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.557004 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.558021 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.577599 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"] Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660185 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod 
\"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761818 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.766920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.766933 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.774331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.781617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.864332 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.883367 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.445945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"] Feb 26 20:22:04 crc kubenswrapper[4722]: W0226 20:22:04.453254 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d72a53a_52c1_427e_a1be_81a00129c7bd.slice/crio-064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224 WatchSource:0}: Error finding container 064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224: Status 404 returned error can't find the container with id 064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224 Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.495630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerStarted","Data":"064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224"} Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.890612 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.088604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"f4cf0607-aae4-41cb-9515-5669ed2a4235\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.093578 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp" (OuterVolumeSpecName: "kube-api-access-lg9wp") pod "f4cf0607-aae4-41cb-9515-5669ed2a4235" (UID: "f4cf0607-aae4-41cb-9515-5669ed2a4235"). InnerVolumeSpecName "kube-api-access-lg9wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.146354 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:22:05 crc kubenswrapper[4722]: E0226 20:22:05.146619 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.193738 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") on node \"crc\" DevicePath \"\"" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.508538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerStarted","Data":"ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab"} Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511089 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerDied","Data":"e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2"} Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511192 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511271 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.557781 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" podStartSLOduration=2.082031546 podStartE2EDuration="2.557760642s" podCreationTimestamp="2026-02-26 20:22:03 +0000 UTC" firstStartedPulling="2026-02-26 20:22:04.455944902 +0000 UTC m=+1666.992912826" lastFinishedPulling="2026-02-26 20:22:04.931673998 +0000 UTC m=+1667.468641922" observedRunningTime="2026-02-26 20:22:05.528796558 +0000 UTC m=+1668.065764512" watchObservedRunningTime="2026-02-26 20:22:05.557760642 +0000 UTC m=+1668.094728586" Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.977818 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.990039 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:22:06 crc 
kubenswrapper[4722]: I0226 20:22:06.169759 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" path="/var/lib/kubelet/pods/d98b84a0-bedf-45f7-b9ca-14244b272795/volumes" Feb 26 20:22:18 crc kubenswrapper[4722]: I0226 20:22:18.152357 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:22:18 crc kubenswrapper[4722]: E0226 20:22:18.154367 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.355307 4722 scope.go:117] "RemoveContainer" containerID="85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0" Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.380569 4722 scope.go:117] "RemoveContainer" containerID="29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118" Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.409497 4722 scope.go:117] "RemoveContainer" containerID="5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606" Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.490814 4722 scope.go:117] "RemoveContainer" containerID="81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2" Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.543247 4722 scope.go:117] "RemoveContainer" containerID="ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f" Feb 26 20:22:31 crc kubenswrapper[4722]: I0226 20:22:31.146813 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:22:31 crc 
kubenswrapper[4722]: E0226 20:22:31.147952 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:22:43 crc kubenswrapper[4722]: I0226 20:22:43.146347 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:22:43 crc kubenswrapper[4722]: E0226 20:22:43.147276 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:22:58 crc kubenswrapper[4722]: I0226 20:22:58.154052 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:22:58 crc kubenswrapper[4722]: E0226 20:22:58.154880 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.047226 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"] Feb 
26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.057871 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fq8ft"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.069415 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.081036 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.091475 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.102069 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-42ds6"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.111936 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.122396 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fq8ft"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.133797 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.142727 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.156939 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" path="/var/lib/kubelet/pods/66980b23-7973-4558-91ba-6f53c2ad7046/volumes" Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.157523 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" path="/var/lib/kubelet/pods/7bdabe92-f114-4ce7-a52d-af8c640bf2ae/volumes" 
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.158045 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" path="/var/lib/kubelet/pods/e110b2fa-c2a9-482e-9b60-8ca117d38d87/volumes" Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.159294 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" path="/var/lib/kubelet/pods/f4ffd934-6139-4ef1-92b2-a30b7798fe61/volumes" Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.161014 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.162120 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-42ds6"] Feb 26 20:23:02 crc kubenswrapper[4722]: I0226 20:23:02.157426 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" path="/var/lib/kubelet/pods/12bb8485-56aa-436e-abd8-5e63601f2ab8/volumes" Feb 26 20:23:02 crc kubenswrapper[4722]: I0226 20:23:02.158400 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" path="/var/lib/kubelet/pods/cb306548-9870-4ef0-ae38-af8d1edc3c3a/volumes" Feb 26 20:23:09 crc kubenswrapper[4722]: I0226 20:23:09.146697 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:09 crc kubenswrapper[4722]: E0226 20:23:09.147500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.625166 4722 scope.go:117] "RemoveContainer" containerID="c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.650232 4722 scope.go:117] "RemoveContainer" containerID="03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.722053 4722 scope.go:117] "RemoveContainer" containerID="73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.759405 4722 scope.go:117] "RemoveContainer" containerID="e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.812786 4722 scope.go:117] "RemoveContainer" containerID="f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2" Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.866616 4722 scope.go:117] "RemoveContainer" containerID="0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2" Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.037004 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"] Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.047019 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"] Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.146682 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:25 crc kubenswrapper[4722]: E0226 20:23:25.146970 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:26 crc kubenswrapper[4722]: I0226 20:23:26.165708 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" path="/var/lib/kubelet/pods/d56edfd6-ff9d-4a81-820c-250a94048683/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.041219 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.053440 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-667ht"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.066531 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.077064 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-667ht"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.086450 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qtmxl"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.095500 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gdd4v"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.104117 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qtmxl"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.112864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.122266 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gdd4v"] Feb 26 20:23:28 crc kubenswrapper[4722]: 
I0226 20:23:28.132314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cg47w"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.171349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" path="/var/lib/kubelet/pods/2842874a-dd3a-44ba-ba7e-e0d8f41be944/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.172219 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" path="/var/lib/kubelet/pods/3059b1f6-b323-4632-8296-c4eec81bb239/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.173259 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" path="/var/lib/kubelet/pods/4091b496-0010-42d3-97d6-281d47ae3f1c/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.188780 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" path="/var/lib/kubelet/pods/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.189630 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.189746 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.193932 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cg47w"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.203768 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.213237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"] Feb 26 
20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.222056 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"] Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.159411 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" path="/var/lib/kubelet/pods/50d98fd3-85f9-400a-9492-7add2a485d7c/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.160539 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" path="/var/lib/kubelet/pods/d8205614-2f8f-4d32-8522-e76f6e7b9c69/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.162086 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" path="/var/lib/kubelet/pods/e2de5980-b357-42e1-8630-ea5b2751f224/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.162674 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" path="/var/lib/kubelet/pods/e4315c1e-5007-4f92-b729-ac02cfdbc2ce/volumes" Feb 26 20:23:37 crc kubenswrapper[4722]: I0226 20:23:37.040002 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-x7zlz"] Feb 26 20:23:37 crc kubenswrapper[4722]: I0226 20:23:37.051967 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-x7zlz"] Feb 26 20:23:38 crc kubenswrapper[4722]: I0226 20:23:38.157133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" path="/var/lib/kubelet/pods/64b602b0-4c3e-4f7b-a1e8-961510e33097/volumes" Feb 26 20:23:40 crc kubenswrapper[4722]: I0226 20:23:40.146193 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:40 crc kubenswrapper[4722]: E0226 20:23:40.146897 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:43 crc kubenswrapper[4722]: I0226 20:23:43.662579 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerID="ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab" exitCode=0 Feb 26 20:23:43 crc kubenswrapper[4722]: I0226 20:23:43.662674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerDied","Data":"ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab"} Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.263006 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.404490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs" (OuterVolumeSpecName: "kube-api-access-njczs") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). InnerVolumeSpecName "kube-api-access-njczs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.427203 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory" (OuterVolumeSpecName: "inventory") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.428795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.500992 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.501029 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.501042 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.683913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerDied","Data":"064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224"} Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.684187 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 
20:23:45.683957 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.761768 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:45 crc kubenswrapper[4722]: E0226 20:23:45.762251 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762275 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: E0226 20:23:45.762301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762311 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762554 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762585 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.763754 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.766756 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.766970 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.767126 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.767295 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.781214 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 
20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013774 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.018168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.023759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.043401 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.080236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.649625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.693876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerStarted","Data":"df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e"} Feb 26 20:23:47 crc kubenswrapper[4722]: I0226 20:23:47.715643 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerStarted","Data":"431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240"} Feb 26 20:23:47 crc kubenswrapper[4722]: I0226 20:23:47.746712 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" podStartSLOduration=2.3030960719999998 podStartE2EDuration="2.746691708s" podCreationTimestamp="2026-02-26 20:23:45 +0000 UTC" firstStartedPulling="2026-02-26 20:23:46.65352002 +0000 UTC m=+1769.190487944" lastFinishedPulling="2026-02-26 20:23:47.097115636 +0000 UTC m=+1769.634083580" observedRunningTime="2026-02-26 20:23:47.734669177 +0000 UTC m=+1770.271637111" watchObservedRunningTime="2026-02-26 20:23:47.746691708 +0000 UTC m=+1770.283659642" Feb 26 20:23:52 crc kubenswrapper[4722]: I0226 20:23:52.146083 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:52 crc kubenswrapper[4722]: E0226 20:23:52.146841 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:57 crc kubenswrapper[4722]: I0226 20:23:57.027897 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:23:57 crc kubenswrapper[4722]: I0226 20:23:57.036591 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:23:58 crc kubenswrapper[4722]: I0226 20:23:58.158600 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" path="/var/lib/kubelet/pods/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf/volumes" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.140696 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.142489 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.144904 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.144933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.156902 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.157367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.160267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.262708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.282418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " 
pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.460976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.930737 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: W0226 20:24:00.936650 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37270d6e_59ab_4ed7_872d_629514b0727b.slice/crio-6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f WatchSource:0}: Error finding container 6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f: Status 404 returned error can't find the container with id 6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f Feb 26 20:24:01 crc kubenswrapper[4722]: I0226 20:24:01.873285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerStarted","Data":"6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f"} Feb 26 20:24:02 crc kubenswrapper[4722]: I0226 20:24:02.883420 4722 generic.go:334] "Generic (PLEG): container finished" podID="37270d6e-59ab-4ed7-872d-629514b0727b" containerID="fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af" exitCode=0 Feb 26 20:24:02 crc kubenswrapper[4722]: I0226 20:24:02.883473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerDied","Data":"fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af"} Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.295796 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.443262 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"37270d6e-59ab-4ed7-872d-629514b0727b\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.457267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z" (OuterVolumeSpecName: "kube-api-access-9rl5z") pod "37270d6e-59ab-4ed7-872d-629514b0727b" (UID: "37270d6e-59ab-4ed7-872d-629514b0727b"). InnerVolumeSpecName "kube-api-access-9rl5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.547060 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerDied","Data":"6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f"} Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905476 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905504 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.146473 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:05 crc kubenswrapper[4722]: E0226 20:24:05.147087 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.382622 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.392549 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:24:06 crc kubenswrapper[4722]: I0226 20:24:06.159942 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" path="/var/lib/kubelet/pods/12e9c803-fc70-41f2-83a2-23e6917fa381/volumes" Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.028123 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.037164 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.159720 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" path="/var/lib/kubelet/pods/b8a5702a-6bfd-4f8d-a522-f0460c092b52/volumes" Feb 26 20:24:19 crc kubenswrapper[4722]: I0226 20:24:19.147697 4722 scope.go:117] 
"RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:19 crc kubenswrapper[4722]: E0226 20:24:19.148456 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:23 crc kubenswrapper[4722]: I0226 20:24:23.030514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:24:23 crc kubenswrapper[4722]: I0226 20:24:23.039825 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:24:24 crc kubenswrapper[4722]: I0226 20:24:24.158380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" path="/var/lib/kubelet/pods/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd/volumes" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.021378 4722 scope.go:117] "RemoveContainer" containerID="7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.053751 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.065235 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.092764 4722 scope.go:117] "RemoveContainer" containerID="97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.115404 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 
26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.129595 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.148121 4722 scope.go:117] "RemoveContainer" containerID="116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.177565 4722 scope.go:117] "RemoveContainer" containerID="7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.240662 4722 scope.go:117] "RemoveContainer" containerID="ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.264696 4722 scope.go:117] "RemoveContainer" containerID="fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.316480 4722 scope.go:117] "RemoveContainer" containerID="db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.345790 4722 scope.go:117] "RemoveContainer" containerID="41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.366383 4722 scope.go:117] "RemoveContainer" containerID="deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.402786 4722 scope.go:117] "RemoveContainer" containerID="709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.431479 4722 scope.go:117] "RemoveContainer" containerID="fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.450545 4722 scope.go:117] "RemoveContainer" containerID="fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 
20:24:25.474625 4722 scope.go:117] "RemoveContainer" containerID="85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.501334 4722 scope.go:117] "RemoveContainer" containerID="30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67" Feb 26 20:24:26 crc kubenswrapper[4722]: I0226 20:24:26.159564 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" path="/var/lib/kubelet/pods/3d551533-7396-4941-a62c-b1a0039f6ddc/volumes" Feb 26 20:24:26 crc kubenswrapper[4722]: I0226 20:24:26.161170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" path="/var/lib/kubelet/pods/f7f3da1b-cb51-4235-8d61-d44ba069528c/volumes" Feb 26 20:24:33 crc kubenswrapper[4722]: I0226 20:24:33.146245 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:33 crc kubenswrapper[4722]: E0226 20:24:33.147267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.048717 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.060591 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.161373 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" 
path="/var/lib/kubelet/pods/0f37d21c-75cb-471a-b68c-db4207ba0f6b/volumes" Feb 26 20:24:46 crc kubenswrapper[4722]: I0226 20:24:46.146432 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:46 crc kubenswrapper[4722]: E0226 20:24:46.147607 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:56 crc kubenswrapper[4722]: I0226 20:24:56.480774 4722 generic.go:334] "Generic (PLEG): container finished" podID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerID="431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240" exitCode=0 Feb 26 20:24:56 crc kubenswrapper[4722]: I0226 20:24:56.481386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerDied","Data":"431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240"} Feb 26 20:24:57 crc kubenswrapper[4722]: I0226 20:24:57.146480 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:57 crc kubenswrapper[4722]: E0226 20:24:57.146712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.031770 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.206427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26" (OuterVolumeSpecName: "kube-api-access-mhz26") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "kube-api-access-mhz26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.230093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory" (OuterVolumeSpecName: "inventory") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.230433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302823 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302870 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302888 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.500763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" 
event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerDied","Data":"df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e"} Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.501115 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.500913 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.651949 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:24:58 crc kubenswrapper[4722]: E0226 20:24:58.652453 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652476 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: E0226 20:24:58.652489 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652498 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652729 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652759 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.653661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.662957 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.662983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.663231 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.663542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.671887 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711644 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.829801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.829835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.849034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:59 crc kubenswrapper[4722]: I0226 20:24:59.028311 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:59 crc kubenswrapper[4722]: I0226 20:24:59.565689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.517976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerStarted","Data":"e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5"} Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.518530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerStarted","Data":"6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6"} Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.547465 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" podStartSLOduration=2.071184308 podStartE2EDuration="2.547443598s" podCreationTimestamp="2026-02-26 20:24:58 +0000 UTC" firstStartedPulling="2026-02-26 20:24:59.57218648 +0000 UTC m=+1842.109154404" lastFinishedPulling="2026-02-26 20:25:00.04844577 +0000 UTC m=+1842.585413694" observedRunningTime="2026-02-26 20:25:00.539414753 +0000 UTC m=+1843.076382707" watchObservedRunningTime="2026-02-26 20:25:00.547443598 +0000 UTC m=+1843.084411532" Feb 26 20:25:05 crc kubenswrapper[4722]: I0226 20:25:05.586555 4722 generic.go:334] "Generic (PLEG): container finished" podID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerID="e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5" exitCode=0 Feb 26 20:25:05 crc kubenswrapper[4722]: I0226 20:25:05.586657 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerDied","Data":"e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5"} Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.080827 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.223437 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx" (OuterVolumeSpecName: "kube-api-access-6fkmx") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "kube-api-access-6fkmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.247272 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.247758 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory" (OuterVolumeSpecName: "inventory") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325773 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325811 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325829 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642308 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" 
event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerDied","Data":"6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6"} Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642379 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642461 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.678629 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:07 crc kubenswrapper[4722]: E0226 20:25:07.679112 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.679135 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.679385 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.680369 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.684713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.684971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.685248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.685777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.693240 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.844828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.844875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.845049 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.950336 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.966736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.975647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.001482 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.548701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.558731 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.652801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerStarted","Data":"dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6"} Feb 26 20:25:09 crc kubenswrapper[4722]: I0226 20:25:09.663769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerStarted","Data":"12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e"} Feb 26 20:25:09 crc kubenswrapper[4722]: I0226 20:25:09.685957 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" podStartSLOduration=2.266867517 podStartE2EDuration="2.685938719s" podCreationTimestamp="2026-02-26 20:25:07 +0000 UTC" firstStartedPulling="2026-02-26 20:25:08.558420021 +0000 UTC m=+1851.095387945" lastFinishedPulling="2026-02-26 20:25:08.977491223 +0000 UTC m=+1851.514459147" observedRunningTime="2026-02-26 20:25:09.680258097 +0000 UTC m=+1852.217226041" watchObservedRunningTime="2026-02-26 20:25:09.685938719 +0000 UTC m=+1852.222906643" Feb 26 20:25:10 crc kubenswrapper[4722]: I0226 20:25:10.150769 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:25:10 crc kubenswrapper[4722]: E0226 
20:25:10.151263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:25:23 crc kubenswrapper[4722]: I0226 20:25:23.147123 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:25:23 crc kubenswrapper[4722]: E0226 20:25:23.148122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.803346 4722 scope.go:117] "RemoveContainer" containerID="bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454"
Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.837191 4722 scope.go:117] "RemoveContainer" containerID="734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec"
Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.884563 4722 scope.go:117] "RemoveContainer" containerID="623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed"
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.057603 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"]
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.086614 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"]
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.094973 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hlxtf"]
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.103254 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"]
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.112076 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"]
Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.121278 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hlxtf"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.046608 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.063716 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.075911 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.087633 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.095882 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.104545 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"]
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.163001 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" path="/var/lib/kubelet/pods/37b676a2-eba1-45dd-accd-84f2c1d0eba6/volumes"
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.164558 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" path="/var/lib/kubelet/pods/9ef0d022-c81c-489e-91aa-209be0812ce0/volumes"
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.165474 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" path="/var/lib/kubelet/pods/ac8f5041-719a-463a-be2b-58da5280e1b9/volumes"
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.166432 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" path="/var/lib/kubelet/pods/af30249f-96fd-4efc-a9f1-9d571dc0e104/volumes"
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.167793 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" path="/var/lib/kubelet/pods/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac/volumes"
Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.168345 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" path="/var/lib/kubelet/pods/fe5cc671-e3c0-4b89-a2db-be576bf17d80/volumes"
Feb 26 20:25:35 crc kubenswrapper[4722]: I0226 20:25:35.145687 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:25:35 crc kubenswrapper[4722]: I0226 20:25:35.896561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"}
Feb 26 20:25:48 crc kubenswrapper[4722]: I0226 20:25:48.003370 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerID="12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e" exitCode=0
Feb 26 20:25:48 crc kubenswrapper[4722]: I0226 20:25:48.003558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerDied","Data":"12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e"}
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.592884 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.730918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") "
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.731206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") "
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.731317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") "
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.737237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k" (OuterVolumeSpecName: "kube-api-access-cfs5k") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). InnerVolumeSpecName "kube-api-access-cfs5k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.761171 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory" (OuterVolumeSpecName: "inventory") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.761370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.836901 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.836997 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") on node \"crc\" DevicePath \"\""
Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.837105 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.022907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerDied","Data":"dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6"}
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.022958 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.023002 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"]
Feb 26 20:25:50 crc kubenswrapper[4722]: E0226 20:25:50.121492 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121515 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121716 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.122443 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.124768 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.124867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.125318 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.130023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.140015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"]
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.280683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.290708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.291087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.479537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:25:51 crc kubenswrapper[4722]: I0226 20:25:51.075992 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"]
Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.066373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerStarted","Data":"31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf"}
Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.067115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerStarted","Data":"b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"}
Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.100782 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" podStartSLOduration=1.5938675679999998 podStartE2EDuration="2.100763346s" podCreationTimestamp="2026-02-26 20:25:50 +0000 UTC" firstStartedPulling="2026-02-26 20:25:51.068853075 +0000 UTC m=+1893.605820999" lastFinishedPulling="2026-02-26 20:25:51.575748853 +0000 UTC m=+1894.112716777" observedRunningTime="2026-02-26 20:25:52.089657559 +0000 UTC m=+1894.626625493" watchObservedRunningTime="2026-02-26 20:25:52.100763346 +0000 UTC m=+1894.637731280"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.137301 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"]
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.145940 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.153723 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.153844 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.154276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.179262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"]
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.284017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.385659 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.404218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.485377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.941246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"]
Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.038539 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"]
Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.053070 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"]
Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.181112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerStarted","Data":"9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729"}
Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.158193 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e863110f-e026-4433-8992-8ed0ae33521a" path="/var/lib/kubelet/pods/e863110f-e026-4433-8992-8ed0ae33521a/volumes"
Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.196913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerStarted","Data":"1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7"}
Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.217452 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" podStartSLOduration=1.406141998 podStartE2EDuration="2.217433983s" podCreationTimestamp="2026-02-26 20:26:00 +0000 UTC" firstStartedPulling="2026-02-26 20:26:00.94379839 +0000 UTC m=+1903.480766314" lastFinishedPulling="2026-02-26 20:26:01.755090355 +0000 UTC m=+1904.292058299" observedRunningTime="2026-02-26 20:26:02.211685889 +0000 UTC m=+1904.748653813" watchObservedRunningTime="2026-02-26 20:26:02.217433983 +0000 UTC m=+1904.754401907"
Feb 26 20:26:03 crc kubenswrapper[4722]: I0226 20:26:03.206945 4722 generic.go:334] "Generic (PLEG): container finished" podID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerID="1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7" exitCode=0
Feb 26 20:26:03 crc kubenswrapper[4722]: I0226 20:26:03.207084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerDied","Data":"1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7"}
Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.639518 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.776426 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") "
Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.782815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw" (OuterVolumeSpecName: "kube-api-access-kbsrw") pod "89b25625-2a04-40bd-b7db-f6fa3b1fc25f" (UID: "89b25625-2a04-40bd-b7db-f6fa3b1fc25f"). InnerVolumeSpecName "kube-api-access-kbsrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.879124 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.227826 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7"
Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.227736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerDied","Data":"9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729"}
Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.234311 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729"
Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.276992 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"]
Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.285423 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"]
Feb 26 20:26:06 crc kubenswrapper[4722]: I0226 20:26:06.159324 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" path="/var/lib/kubelet/pods/10c709bd-8242-4d15-b343-b6e07c3cb44c/volumes"
Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.035080 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"]
Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.048784 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"]
Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.161291 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" path="/var/lib/kubelet/pods/85ac107a-489c-4551-a4ed-49cd15006d82/volumes"
Feb 26 20:26:25 crc kubenswrapper[4722]: I0226 20:26:25.031314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"]
Feb 26 20:26:25 crc kubenswrapper[4722]: I0226 20:26:25.042925 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"]
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.020500 4722 scope.go:117] "RemoveContainer" containerID="a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.093194 4722 scope.go:117] "RemoveContainer" containerID="95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.134085 4722 scope.go:117] "RemoveContainer" containerID="49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.169124 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" path="/var/lib/kubelet/pods/19cd0379-1ef6-4db2-b900-2ca9efaf0452/volumes"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.178958 4722 scope.go:117] "RemoveContainer" containerID="0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.241728 4722 scope.go:117] "RemoveContainer" containerID="623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.300665 4722 scope.go:117] "RemoveContainer" containerID="d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.349899 4722 scope.go:117] "RemoveContainer" containerID="741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.399290 4722 scope.go:117] "RemoveContainer" containerID="30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.420470 4722 scope.go:117] "RemoveContainer" containerID="c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803"
Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.442625 4722 scope.go:117] "RemoveContainer" containerID="afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3"
Feb 26 20:26:38 crc kubenswrapper[4722]: I0226 20:26:38.585628 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerID="31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf" exitCode=0
Feb 26 20:26:38 crc kubenswrapper[4722]: I0226 20:26:38.586219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerDied","Data":"31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf"}
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.135151 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") "
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218300 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") "
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218453 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") "
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.224004 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9" (OuterVolumeSpecName: "kube-api-access-xdtc9") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "kube-api-access-xdtc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.252579 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.255630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory" (OuterVolumeSpecName: "inventory") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322177 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322226 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322239 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607091 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerDied","Data":"b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"}
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607517 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607165 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.703015 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"]
Feb 26 20:26:40 crc kubenswrapper[4722]: E0226 20:26:40.704998 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: E0226 20:26:40.705042 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705049 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705262 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705281 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708396 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708668 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.710668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.716980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"]
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833726 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.940758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName:
\"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.940772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.953663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.026042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.569558 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"] Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.618293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerStarted","Data":"5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1"} Feb 26 20:26:42 crc kubenswrapper[4722]: I0226 20:26:42.627476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerStarted","Data":"3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500"} Feb 26 20:26:42 crc kubenswrapper[4722]: I0226 20:26:42.642642 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" podStartSLOduration=2.168519927 podStartE2EDuration="2.64261787s" podCreationTimestamp="2026-02-26 20:26:40 +0000 UTC" firstStartedPulling="2026-02-26 20:26:41.573034011 +0000 UTC m=+1944.110001935" lastFinishedPulling="2026-02-26 20:26:42.047131954 +0000 UTC m=+1944.584099878" observedRunningTime="2026-02-26 20:26:42.641069159 +0000 UTC m=+1945.178037093" watchObservedRunningTime="2026-02-26 20:26:42.64261787 +0000 UTC m=+1945.179585804" Feb 26 20:26:49 crc kubenswrapper[4722]: I0226 20:26:49.695446 4722 generic.go:334] "Generic (PLEG): container finished" podID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerID="3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500" exitCode=0 Feb 26 20:26:49 crc kubenswrapper[4722]: I0226 20:26:49.695990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" 
event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerDied","Data":"3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500"} Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.210383 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.278862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.279192 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.279337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.285791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt" (OuterVolumeSpecName: "kube-api-access-l9bqt") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "kube-api-access-l9bqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.314164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.319235 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382542 4722 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382576 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") on node \"crc\" DevicePath \"\"" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382588 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.717448 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" 
event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerDied","Data":"5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1"} Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.718019 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.717499 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.965705 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"] Feb 26 20:26:51 crc kubenswrapper[4722]: E0226 20:26:51.966315 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.966336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.966553 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.967461 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972416 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972437 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972551 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.978724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"] Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094909 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.207214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: 
\"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.215297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.219229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.299698 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.835893 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"] Feb 26 20:26:52 crc kubenswrapper[4722]: W0226 20:26:52.845719 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7a8d95_7d72_427d_8bd1_f0ec3e512458.slice/crio-9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9 WatchSource:0}: Error finding container 9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9: Status 404 returned error can't find the container with id 9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9 Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.748592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerStarted","Data":"d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812"} Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.748950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerStarted","Data":"9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9"} Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.779244 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" podStartSLOduration=2.342369561 podStartE2EDuration="2.779218327s" podCreationTimestamp="2026-02-26 20:26:51 +0000 UTC" firstStartedPulling="2026-02-26 20:26:52.848652115 +0000 UTC m=+1955.385620039" lastFinishedPulling="2026-02-26 20:26:53.285500881 +0000 UTC m=+1955.822468805" observedRunningTime="2026-02-26 
20:26:53.764549676 +0000 UTC m=+1956.301517600" watchObservedRunningTime="2026-02-26 20:26:53.779218327 +0000 UTC m=+1956.316186281" Feb 26 20:27:02 crc kubenswrapper[4722]: I0226 20:27:02.852580 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerID="d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812" exitCode=0 Feb 26 20:27:02 crc kubenswrapper[4722]: I0226 20:27:02.852677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerDied","Data":"d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812"} Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.393737 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: 
\"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.500680 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w" (OuterVolumeSpecName: "kube-api-access-b799w") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "kube-api-access-b799w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.523128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.533238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory" (OuterVolumeSpecName: "inventory") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.596925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.597233 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.597314 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerDied","Data":"9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9"} Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874951 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874999 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.949626 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"] Feb 26 20:27:04 crc kubenswrapper[4722]: E0226 20:27:04.950241 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.950267 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.950529 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.951508 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.954517 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.954850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.955033 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.955196 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.962791 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"] Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.107432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.107859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 
20:27:05.107885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.215550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.218711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.235885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.291630 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.848079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"] Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.885969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerStarted","Data":"6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c"} Feb 26 20:27:06 crc kubenswrapper[4722]: I0226 20:27:06.896180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerStarted","Data":"95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0"} Feb 26 20:27:06 crc kubenswrapper[4722]: I0226 20:27:06.924094 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" podStartSLOduration=2.541786508 podStartE2EDuration="2.924077323s" podCreationTimestamp="2026-02-26 20:27:04 +0000 UTC" firstStartedPulling="2026-02-26 20:27:05.871043238 +0000 UTC m=+1968.408011162" lastFinishedPulling="2026-02-26 20:27:06.253334053 +0000 UTC m=+1968.790301977" observedRunningTime="2026-02-26 20:27:06.917037721 +0000 UTC m=+1969.454005645" watchObservedRunningTime="2026-02-26 20:27:06.924077323 +0000 UTC m=+1969.461045247" Feb 26 20:27:09 crc kubenswrapper[4722]: I0226 20:27:09.064013 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:27:09 crc kubenswrapper[4722]: I0226 20:27:09.078405 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:27:10 crc kubenswrapper[4722]: I0226 20:27:10.156957 
4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" path="/var/lib/kubelet/pods/8b3d3547-11a7-4e10-b57a-a057d2c60e70/volumes" Feb 26 20:27:15 crc kubenswrapper[4722]: I0226 20:27:15.979090 4722 generic.go:334] "Generic (PLEG): container finished" podID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerID="95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0" exitCode=0 Feb 26 20:27:15 crc kubenswrapper[4722]: I0226 20:27:15.979201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerDied","Data":"95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0"} Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.467866 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.568952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.569148 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.569308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod 
\"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.573912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n" (OuterVolumeSpecName: "kube-api-access-wh97n") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "kube-api-access-wh97n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.597215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory" (OuterVolumeSpecName: "inventory") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.602697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671661 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671703 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671721 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerDied","Data":"6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c"} Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997339 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.087927 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:18 crc kubenswrapper[4722]: E0226 20:27:18.088416 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.088440 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.088677 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.089396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.091688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.094687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095394 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095514 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.096000 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.096114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.103958 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.180909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.180955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181159 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283424 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283625 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: 
\"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287650 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287931 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288176 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288446 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.289119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.291131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.291900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.292208 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.293630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.295458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: 
I0226 20:27:18.298683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299845 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.409888 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.419010 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.948261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:19 crc kubenswrapper[4722]: I0226 20:27:19.007007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerStarted","Data":"944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200"} Feb 26 20:27:19 crc kubenswrapper[4722]: I0226 20:27:19.395634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:20 crc kubenswrapper[4722]: I0226 20:27:20.017263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerStarted","Data":"4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b"} Feb 26 20:27:20 crc kubenswrapper[4722]: I0226 20:27:20.047658 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" podStartSLOduration=1.621351239 podStartE2EDuration="2.047632141s" podCreationTimestamp="2026-02-26 20:27:18 +0000 UTC" firstStartedPulling="2026-02-26 20:27:18.962402401 +0000 UTC m=+1981.499370345" lastFinishedPulling="2026-02-26 20:27:19.388683303 +0000 UTC m=+1981.925651247" observedRunningTime="2026-02-26 20:27:20.034828684 +0000 UTC m=+1982.571796608" watchObservedRunningTime="2026-02-26 20:27:20.047632141 +0000 UTC m=+1982.584600065" Feb 26 20:27:26 crc kubenswrapper[4722]: I0226 20:27:26.681511 4722 scope.go:117] "RemoveContainer" containerID="c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e" Feb 26 20:27:53 crc 
kubenswrapper[4722]: I0226 20:27:53.040434 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.050077 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.487592 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.487921 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.158816 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" path="/var/lib/kubelet/pods/f27e7d78-b723-43b0-8734-8892bd8cfd3b/volumes" Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.348444 4722 generic.go:334] "Generic (PLEG): container finished" podID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerID="4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b" exitCode=0 Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.348484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerDied","Data":"4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b"} Feb 26 20:27:55 crc kubenswrapper[4722]: I0226 20:27:55.884459 4722 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070311 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070408 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070447 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc 
kubenswrapper[4722]: I0226 20:27:56.070465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070690 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070715 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070748 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070779 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079236 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079930 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn" (OuterVolumeSpecName: "kube-api-access-g6tsn") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "kube-api-access-g6tsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.083363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.095891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.108714 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory" (OuterVolumeSpecName: "inventory") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.109269 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173319 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173353 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173369 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173379 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173388 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6tsn\" (UniqueName: 
\"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173397 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173407 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173416 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173423 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173432 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173441 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173450 4722 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173460 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173492 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerDied","Data":"944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200"} Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370827 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370431 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549237 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:56 crc kubenswrapper[4722]: E0226 20:27:56.549650 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549687 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549902 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.550675 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558380 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558493 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558500 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558557 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.574059 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784727 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.786067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.790009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.790948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.793049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.801581 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.868362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:57 crc kubenswrapper[4722]: I0226 20:27:57.465358 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.394030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerStarted","Data":"62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631"} Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.394499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerStarted","Data":"9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"} Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.419333 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" podStartSLOduration=2.025653408 podStartE2EDuration="2.419305149s" podCreationTimestamp="2026-02-26 20:27:56 +0000 UTC" firstStartedPulling="2026-02-26 20:27:57.46686879 +0000 UTC m=+2020.003836714" lastFinishedPulling="2026-02-26 20:27:57.860520541 +0000 UTC m=+2020.397488455" observedRunningTime="2026-02-26 20:27:58.410208183 +0000 UTC m=+2020.947176107" watchObservedRunningTime="2026-02-26 20:27:58.419305149 +0000 UTC m=+2020.956273073" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.025828 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.036596 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.132912 4722 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.134548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.136668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.137198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.137335 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.159982 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" path="/var/lib/kubelet/pods/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6/volumes" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.160855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.182028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.284645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: 
\"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.304734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.456459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.920796 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:28:00 crc kubenswrapper[4722]: W0226 20:28:00.922228 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc1f2f35_9607_4719_993b_8678440d3a0b.slice/crio-30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56 WatchSource:0}: Error finding container 30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56: Status 404 returned error can't find the container with id 30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56 Feb 26 20:28:01 crc kubenswrapper[4722]: I0226 20:28:01.421162 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerStarted","Data":"30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56"} Feb 26 20:28:02 crc kubenswrapper[4722]: I0226 20:28:02.433264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" 
event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerStarted","Data":"c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c"} Feb 26 20:28:02 crc kubenswrapper[4722]: I0226 20:28:02.452343 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" podStartSLOduration=1.3436115659999999 podStartE2EDuration="2.452318032s" podCreationTimestamp="2026-02-26 20:28:00 +0000 UTC" firstStartedPulling="2026-02-26 20:28:00.925226244 +0000 UTC m=+2023.462194168" lastFinishedPulling="2026-02-26 20:28:02.03393271 +0000 UTC m=+2024.570900634" observedRunningTime="2026-02-26 20:28:02.445761495 +0000 UTC m=+2024.982729419" watchObservedRunningTime="2026-02-26 20:28:02.452318032 +0000 UTC m=+2024.989285966" Feb 26 20:28:03 crc kubenswrapper[4722]: I0226 20:28:03.444243 4722 generic.go:334] "Generic (PLEG): container finished" podID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerID="c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c" exitCode=0 Feb 26 20:28:03 crc kubenswrapper[4722]: I0226 20:28:03.444287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerDied","Data":"c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c"} Feb 26 20:28:04 crc kubenswrapper[4722]: I0226 20:28:04.898742 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.001648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"bc1f2f35-9607-4719-993b-8678440d3a0b\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.006896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4" (OuterVolumeSpecName: "kube-api-access-26ts4") pod "bc1f2f35-9607-4719-993b-8678440d3a0b" (UID: "bc1f2f35-9607-4719-993b-8678440d3a0b"). InnerVolumeSpecName "kube-api-access-26ts4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.104480 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") on node \"crc\" DevicePath \"\"" Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465518 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerDied","Data":"30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56"} Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465554 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56" Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465776 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.527255 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"] Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.547493 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"] Feb 26 20:28:06 crc kubenswrapper[4722]: I0226 20:28:06.159520 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" path="/var/lib/kubelet/pods/f4cf0607-aae4-41cb-9515-5669ed2a4235/volumes" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.480789 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"] Feb 26 20:28:13 crc kubenswrapper[4722]: E0226 20:28:13.485755 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.485781 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.486023 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.487846 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.516408 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"] Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.694240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.811690 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:14 crc kubenswrapper[4722]: W0226 20:28:14.306370 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6834bce_280f_4d6c_b42a_e469f05008d1.slice/crio-63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c WatchSource:0}: Error finding container 63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c: Status 404 returned error can't find the container with id 63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.309800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"] Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.547026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e"} Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.547074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c"} Feb 26 20:28:15 crc kubenswrapper[4722]: I0226 20:28:15.557372 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerID="bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e" exitCode=0 Feb 26 20:28:15 crc kubenswrapper[4722]: I0226 20:28:15.557423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" 
event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerDied","Data":"bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e"} Feb 26 20:28:23 crc kubenswrapper[4722]: I0226 20:28:23.487196 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:28:23 crc kubenswrapper[4722]: I0226 20:28:23.487813 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:28:26 crc kubenswrapper[4722]: I0226 20:28:26.765401 4722 scope.go:117] "RemoveContainer" containerID="03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d" Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.192102 4722 scope.go:117] "RemoveContainer" containerID="68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21" Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.252793 4722 scope.go:117] "RemoveContainer" containerID="672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493" Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.696802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742"} Feb 26 20:28:28 crc kubenswrapper[4722]: I0226 20:28:28.706828 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerID="656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742" 
exitCode=0 Feb 26 20:28:28 crc kubenswrapper[4722]: I0226 20:28:28.706929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerDied","Data":"656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742"} Feb 26 20:28:29 crc kubenswrapper[4722]: I0226 20:28:29.718580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"50067abcdcf7199496a3bf160ace527915ff961b118252e86df05e1b347b0c08"} Feb 26 20:28:29 crc kubenswrapper[4722]: I0226 20:28:29.739170 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcrp4" podStartSLOduration=3.126990373 podStartE2EDuration="16.73915231s" podCreationTimestamp="2026-02-26 20:28:13 +0000 UTC" firstStartedPulling="2026-02-26 20:28:15.559768296 +0000 UTC m=+2038.096736220" lastFinishedPulling="2026-02-26 20:28:29.171930233 +0000 UTC m=+2051.708898157" observedRunningTime="2026-02-26 20:28:29.735506662 +0000 UTC m=+2052.272474596" watchObservedRunningTime="2026-02-26 20:28:29.73915231 +0000 UTC m=+2052.276120244" Feb 26 20:28:33 crc kubenswrapper[4722]: I0226 20:28:33.812548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:33 crc kubenswrapper[4722]: I0226 20:28:33.813155 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:34 crc kubenswrapper[4722]: I0226 20:28:34.865591 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcrp4" podUID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerName="registry-server" probeResult="failure" output=< Feb 26 20:28:34 crc kubenswrapper[4722]: timeout: failed to connect 
service ":50051" within 1s Feb 26 20:28:34 crc kubenswrapper[4722]: > Feb 26 20:28:43 crc kubenswrapper[4722]: I0226 20:28:43.892977 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:43 crc kubenswrapper[4722]: I0226 20:28:43.947675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vcrp4" Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.510500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"] Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.679916 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.680185 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sj5r4" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server" containerID="cri-o://f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f" gracePeriod=2 Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.882351 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f" exitCode=0 Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.882443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f"} Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.227881 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258262 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258564 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.260737 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities" (OuterVolumeSpecName: "utilities") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.270360 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k" (OuterVolumeSpecName: "kube-api-access-nx47k") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "kube-api-access-nx47k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.361601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") on node \"crc\" DevicePath \"\"" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.361631 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.429363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.462657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.893965 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.894125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04"} Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.894321 4722 scope.go:117] "RemoveContainer" containerID="f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.928798 4722 scope.go:117] "RemoveContainer" containerID="d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb" Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.933373 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.943037 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.958762 4722 scope.go:117] "RemoveContainer" containerID="18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f" Feb 26 20:28:46 crc kubenswrapper[4722]: I0226 20:28:46.158016 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" path="/var/lib/kubelet/pods/ededdfa7-a21a-4901-bb64-a8f9923a663a/volumes" Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.487716 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488083 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488169 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488997 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.489127 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" gracePeriod=600 Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971430 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" exitCode=0 Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"} Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971805 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"} Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971830 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:28:57 crc kubenswrapper[4722]: I0226 20:28:57.015115 4722 generic.go:334] "Generic (PLEG): container finished" podID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerID="62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631" exitCode=0 Feb 26 20:28:57 crc kubenswrapper[4722]: I0226 20:28:57.015249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerDied","Data":"62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631"} Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.515732 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636514 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636795 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.652363 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt" (OuterVolumeSpecName: "kube-api-access-gtzpt") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "kube-api-access-gtzpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.652802 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.664511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.666293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory" (OuterVolumeSpecName: "inventory") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.672954 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.738980 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739018 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739031 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739039 4722 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739048 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerDied","Data":"9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"}
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037243 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037262 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133496 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"]
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133913 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-utilities"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133931 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-utilities"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133944 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133951 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-content"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133974 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-content"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.134005 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134011 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134226 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134239 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134920 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137750 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137917 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.138795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.139641 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.142408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.150451 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"]
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.356925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.357554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.357998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.358407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.362391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.380545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.458356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:29:00 crc kubenswrapper[4722]: I0226 20:29:00.008874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"]
Feb 26 20:29:00 crc kubenswrapper[4722]: I0226 20:29:00.049735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerStarted","Data":"a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2"}
Feb 26 20:29:01 crc kubenswrapper[4722]: I0226 20:29:01.060494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerStarted","Data":"8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe"}
Feb 26 20:29:01 crc kubenswrapper[4722]: I0226 20:29:01.086606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" podStartSLOduration=1.5966530749999999 podStartE2EDuration="2.086585197s" podCreationTimestamp="2026-02-26 20:28:59 +0000 UTC" firstStartedPulling="2026-02-26 20:29:00.013508577 +0000 UTC m=+2082.550476501" lastFinishedPulling="2026-02-26 20:29:00.503440689 +0000 UTC m=+2083.040408623" observedRunningTime="2026-02-26 20:29:01.076296128 +0000 UTC m=+2083.613264082" watchObservedRunningTime="2026-02-26 20:29:01.086585197 +0000 UTC m=+2083.623553131"
Feb 26 20:29:47 crc kubenswrapper[4722]: I0226 20:29:47.517365 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerID="8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe" exitCode=0
Feb 26 20:29:47 crc kubenswrapper[4722]: I0226 20:29:47.517453 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerDied","Data":"8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe"}
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.028987 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.092945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") "
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.099004 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd" (OuterVolumeSpecName: "kube-api-access-jk9vd") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "kube-api-access-jk9vd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.099494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.128379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.129477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory" (OuterVolumeSpecName: "inventory") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.133701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.134186 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196220 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196253 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196283 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196295 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196306 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196317 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.535946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerDied","Data":"a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2"}
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.535999 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.536008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.628937 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"]
Feb 26 20:29:49 crc kubenswrapper[4722]: E0226 20:29:49.630603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.630633 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.630967 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.631899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.633651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.633657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.634014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.634896 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.637781 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.666746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"]
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.806896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.806954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807592 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909158 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.910203 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.914078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.914976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.917302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.926560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.927095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.949405 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:29:50 crc kubenswrapper[4722]: I0226 20:29:50.565838 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"]
Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.557853 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerStarted","Data":"1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0"}
Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.557906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerStarted","Data":"079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b"}
Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.581829 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" podStartSLOduration=2.166613803 podStartE2EDuration="2.581812939s" podCreationTimestamp="2026-02-26 20:29:49 +0000 UTC" firstStartedPulling="2026-02-26 20:29:50.559126205 +0000 UTC m=+2133.096094129" lastFinishedPulling="2026-02-26 20:29:50.974325341 +0000 UTC m=+2133.511293265" observedRunningTime="2026-02-26 20:29:51.575707504 +0000 UTC m=+2134.112675458" watchObservedRunningTime="2026-02-26 20:29:51.581812939 +0000 UTC m=+2134.118780863"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.141582 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"]
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.143687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.155616 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.155695 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.161734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.205580 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"]
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.207327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"]
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.207422 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.213788 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"]
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.217875 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.218103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.219317 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"
Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.347216 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.425189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.429112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.439161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.506036 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.540053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.007685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:30:01 crc kubenswrapper[4722]: W0226 20:30:01.101803 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e92f32d_0ad8_4cd5_97d6_cd76d298bb1f.slice/crio-101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7 WatchSource:0}: Error finding container 101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7: Status 404 returned error can't find the container with id 101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7 Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.106507 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"] Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.687177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerStarted","Data":"93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.687494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerStarted","Data":"101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.689269 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerStarted","Data":"04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.710264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" podStartSLOduration=1.710245575 podStartE2EDuration="1.710245575s" podCreationTimestamp="2026-02-26 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:30:01.701601829 +0000 UTC m=+2144.238569773" watchObservedRunningTime="2026-02-26 20:30:01.710245575 +0000 UTC m=+2144.247213499" Feb 26 20:30:02 crc kubenswrapper[4722]: I0226 20:30:02.698089 4722 generic.go:334] "Generic (PLEG): container finished" podID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerID="93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b" exitCode=0 Feb 26 20:30:02 crc kubenswrapper[4722]: I0226 20:30:02.698130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerDied","Data":"93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b"} Feb 26 20:30:03 crc kubenswrapper[4722]: I0226 20:30:03.708237 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerStarted","Data":"d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f"} Feb 26 20:30:03 crc kubenswrapper[4722]: I0226 20:30:03.736062 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" 
podStartSLOduration=1.8453544370000001 podStartE2EDuration="3.736040152s" podCreationTimestamp="2026-02-26 20:30:00 +0000 UTC" firstStartedPulling="2026-02-26 20:30:01.010376772 +0000 UTC m=+2143.547344696" lastFinishedPulling="2026-02-26 20:30:02.901062487 +0000 UTC m=+2145.438030411" observedRunningTime="2026-02-26 20:30:03.725210678 +0000 UTC m=+2146.262178612" watchObservedRunningTime="2026-02-26 20:30:03.736040152 +0000 UTC m=+2146.273008086" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.224666 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.295660 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.295886 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.296030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.296906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume" (OuterVolumeSpecName: "config-volume") 
pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.302287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb" (OuterVolumeSpecName: "kube-api-access-6zfwb") pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "kube-api-access-6zfwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.303773 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399048 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399087 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399097 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.717552 4722 generic.go:334] "Generic (PLEG): container finished" podID="60589e31-13a5-410b-926f-511d262459da" containerID="d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f" exitCode=0 Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.717757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerDied","Data":"d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f"} Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.719992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerDied","Data":"101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7"} Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.720017 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.720022 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.776663 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.787295 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.163754 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" path="/var/lib/kubelet/pods/a13fa204-edf6-4e71-87c7-2a5d7603a100/volumes" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.203640 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.236465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"60589e31-13a5-410b-926f-511d262459da\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.249423 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd" (OuterVolumeSpecName: "kube-api-access-vk8bd") pod "60589e31-13a5-410b-926f-511d262459da" (UID: "60589e31-13a5-410b-926f-511d262459da"). InnerVolumeSpecName "kube-api-access-vk8bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.339784 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerDied","Data":"04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9"} Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740904 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740919 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.803114 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.829312 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:30:08 crc kubenswrapper[4722]: I0226 20:30:08.171079 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" path="/var/lib/kubelet/pods/37270d6e-59ab-4ed7-872d-629514b0727b/volumes" Feb 26 20:30:27 crc kubenswrapper[4722]: I0226 20:30:27.414813 4722 scope.go:117] "RemoveContainer" containerID="88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6" Feb 26 20:30:27 crc kubenswrapper[4722]: I0226 20:30:27.444763 4722 scope.go:117] "RemoveContainer" 
containerID="fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af" Feb 26 20:30:53 crc kubenswrapper[4722]: I0226 20:30:53.487192 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:30:53 crc kubenswrapper[4722]: I0226 20:30:53.487762 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:23 crc kubenswrapper[4722]: I0226 20:31:23.487923 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:31:23 crc kubenswrapper[4722]: I0226 20:31:23.488928 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.487825 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.489156 4722 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.489305 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.490315 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.490454 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" gracePeriod=600 Feb 26 20:31:53 crc kubenswrapper[4722]: E0226 20:31:53.624475 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628054 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" exitCode=0 Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"} Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628187 4722 scope.go:117] "RemoveContainer" containerID="ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.629022 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:31:54 crc kubenswrapper[4722]: E0226 20:31:54.629504 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.163054 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:00 crc kubenswrapper[4722]: E0226 20:32:00.164072 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164291 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: E0226 20:32:00.164397 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164408 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164662 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164701 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.165590 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.165695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169482 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169576 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169504 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.256739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" 
Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.359456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.378540 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.492941 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.944800 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.948032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:01 crc kubenswrapper[4722]: I0226 20:32:01.103976 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-5b495fbf79-442st" podUID="d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 26 20:32:01 crc kubenswrapper[4722]: I0226 20:32:01.700396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerStarted","Data":"e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98"} Feb 26 
20:32:02 crc kubenswrapper[4722]: I0226 20:32:02.713096 4722 generic.go:334] "Generic (PLEG): container finished" podID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerID="f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a" exitCode=0 Feb 26 20:32:02 crc kubenswrapper[4722]: I0226 20:32:02.713448 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerDied","Data":"f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a"} Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.144679 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.260871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.268907 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw" (OuterVolumeSpecName: "kube-api-access-n8hkw") pod "da7ba56d-affb-4cc4-ba3e-d43c0265d472" (UID: "da7ba56d-affb-4cc4-ba3e-d43c0265d472"). InnerVolumeSpecName "kube-api-access-n8hkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.363576 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.733931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerDied","Data":"e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98"} Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.733980 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.734032 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.221469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.229338 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.613968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:05 crc kubenswrapper[4722]: E0226 20:32:05.614402 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.614415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc 
kubenswrapper[4722]: I0226 20:32:05.614592 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.615996 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.626850 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nvg\" (UniqueName: 
\"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796941 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.797539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.797552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.814078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nvg\" (UniqueName: 
\"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.947259 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.170958 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" path="/var/lib/kubelet/pods/89b25625-2a04-40bd-b7db-f6fa3b1fc25f/volumes" Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.500976 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:06 crc kubenswrapper[4722]: W0226 20:32:06.505377 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127e8312_827e_443e_b392_e676f996d05d.slice/crio-9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224 WatchSource:0}: Error finding container 9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224: Status 404 returned error can't find the container with id 9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224 Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.751205 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" exitCode=0 Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.751250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0"} Feb 26 20:32:06 crc kubenswrapper[4722]: 
I0226 20:32:06.751280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224"} Feb 26 20:32:07 crc kubenswrapper[4722]: I0226 20:32:07.762507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} Feb 26 20:32:09 crc kubenswrapper[4722]: I0226 20:32:09.790103 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" exitCode=0 Feb 26 20:32:09 crc kubenswrapper[4722]: I0226 20:32:09.790552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.146312 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:10 crc kubenswrapper[4722]: E0226 20:32:10.147102 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.803764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.820243 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-56spq" podStartSLOduration=2.13980787 podStartE2EDuration="5.820227645s" podCreationTimestamp="2026-02-26 20:32:05 +0000 UTC" firstStartedPulling="2026-02-26 20:32:06.753103086 +0000 UTC m=+2269.290071010" lastFinishedPulling="2026-02-26 20:32:10.433522871 +0000 UTC m=+2272.970490785" observedRunningTime="2026-02-26 20:32:10.817626474 +0000 UTC m=+2273.354594408" watchObservedRunningTime="2026-02-26 20:32:10.820227645 +0000 UTC m=+2273.357195569" Feb 26 20:32:15 crc kubenswrapper[4722]: I0226 20:32:15.947380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:15 crc kubenswrapper[4722]: I0226 20:32:15.948482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.002826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.924423 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.977626 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.671332 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 
20:32:18.673766 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.687748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.820203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.820960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.821034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.881834 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-56spq" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server" containerID="cri-o://fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" gracePeriod=2 Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 
20:32:18.922733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.922928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.922989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.923242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.923472 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.949258 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.020030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.530641 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.642583 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644791 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc 
kubenswrapper[4722]: I0226 20:32:19.646632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities" (OuterVolumeSpecName: "utilities") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.652319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg" (OuterVolumeSpecName: "kube-api-access-54nvg") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "kube-api-access-54nvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.695119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.746966 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.747011 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.747026 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895611 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" exitCode=0 Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895738 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895777 4722 scope.go:117] "RemoveContainer" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899010 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b" exitCode=0 Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"c0edbcda9c3f7b679705f2855bc92f1439d6bbda0a5757f9bb8b30442e913f92"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.959274 4722 scope.go:117] "RemoveContainer" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.985153 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.993723 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.033528 4722 scope.go:117] "RemoveContainer" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100116 4722 scope.go:117] "RemoveContainer" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.100617 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": container with ID starting with fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed not found: ID does not exist" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100742 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} err="failed to get container status \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": rpc error: code = NotFound desc = could not find container \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": container with ID starting with fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed not found: ID does not exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100833 4722 scope.go:117] "RemoveContainer" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.101307 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": container with ID starting with 
e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915 not found: ID does not exist" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101345 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} err="failed to get container status \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": rpc error: code = NotFound desc = could not find container \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": container with ID starting with e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915 not found: ID does not exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101372 4722 scope.go:117] "RemoveContainer" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.101680 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": container with ID starting with 05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0 not found: ID does not exist" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101772 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0"} err="failed to get container status \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": rpc error: code = NotFound desc = could not find container \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": container with ID starting with 05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0 not found: ID does not 
exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.155780 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127e8312-827e-443e-b392-e676f996d05d" path="/var/lib/kubelet/pods/127e8312-827e-443e-b392-e676f996d05d/volumes" Feb 26 20:32:21 crc kubenswrapper[4722]: I0226 20:32:21.929195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e"} Feb 26 20:32:22 crc kubenswrapper[4722]: I0226 20:32:22.946898 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e" exitCode=0 Feb 26 20:32:22 crc kubenswrapper[4722]: I0226 20:32:22.947105 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e"} Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.146130 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:24 crc kubenswrapper[4722]: E0226 20:32:24.146615 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.972199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e"} Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.999752 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j8fkd" podStartSLOduration=2.53834505 podStartE2EDuration="6.999732001s" podCreationTimestamp="2026-02-26 20:32:18 +0000 UTC" firstStartedPulling="2026-02-26 20:32:19.901047924 +0000 UTC m=+2282.438015848" lastFinishedPulling="2026-02-26 20:32:24.362434875 +0000 UTC m=+2286.899402799" observedRunningTime="2026-02-26 20:32:24.99450948 +0000 UTC m=+2287.531477404" watchObservedRunningTime="2026-02-26 20:32:24.999732001 +0000 UTC m=+2287.536699925" Feb 26 20:32:27 crc kubenswrapper[4722]: I0226 20:32:27.601018 4722 scope.go:117] "RemoveContainer" containerID="1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.020464 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.020815 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.082877 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:30 crc kubenswrapper[4722]: I0226 20:32:30.071819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:30 crc kubenswrapper[4722]: I0226 20:32:30.120459 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:32 crc kubenswrapper[4722]: I0226 
20:32:32.039464 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j8fkd" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server" containerID="cri-o://9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" gracePeriod=2 Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.051510 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" exitCode=0 Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.051540 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e"} Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.716647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870657 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.871973 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities" (OuterVolumeSpecName: "utilities") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.876427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb" (OuterVolumeSpecName: "kube-api-access-zmpqb") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "kube-api-access-zmpqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.936587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973216 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973250 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973311 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"c0edbcda9c3f7b679705f2855bc92f1439d6bbda0a5757f9bb8b30442e913f92"} Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063725 4722 scope.go:117] "RemoveContainer" containerID="9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.089869 4722 scope.go:117] "RemoveContainer" containerID="0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.124324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.140448 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.150336 4722 scope.go:117] "RemoveContainer" containerID="bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.162730 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" path="/var/lib/kubelet/pods/3bd5fe57-e4ed-4f01-b933-8d85a6abb368/volumes" Feb 26 20:32:38 crc kubenswrapper[4722]: I0226 20:32:38.155499 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:38 crc kubenswrapper[4722]: E0226 20:32:38.156911 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:50 crc kubenswrapper[4722]: I0226 20:32:50.146511 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:50 crc kubenswrapper[4722]: E0226 20:32:50.147289 4722 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:04 crc kubenswrapper[4722]: I0226 20:33:04.146334 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:04 crc kubenswrapper[4722]: E0226 20:33:04.147242 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:16 crc kubenswrapper[4722]: I0226 20:33:16.146248 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:16 crc kubenswrapper[4722]: E0226 20:33:16.147240 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:28 crc kubenswrapper[4722]: I0226 20:33:28.152275 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:28 crc kubenswrapper[4722]: E0226 20:33:28.152911 4722 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:38 crc kubenswrapper[4722]: I0226 20:33:38.784775 4722 generic.go:334] "Generic (PLEG): container finished" podID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerID="1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0" exitCode=0 Feb 26 20:33:38 crc kubenswrapper[4722]: I0226 20:33:38.785043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerDied","Data":"1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0"} Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.306641 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.404814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.404914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.410768 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv" (OuterVolumeSpecName: "kube-api-access-zqbbv") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "kube-api-access-zqbbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.411870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.433928 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.434427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory" (OuterVolumeSpecName: "inventory") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.439168 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508464 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508507 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508518 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508527 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") on node \"crc\" DevicePath \"\"" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508540 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerDied","Data":"079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b"} Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804667 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804690 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.898786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"] Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899297 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-utilities" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899322 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-utilities" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899346 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899371 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-content" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899378 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" 
containerName="extract-content" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899394 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899430 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899439 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-utilities" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899471 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-utilities" Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-content" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899497 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-content" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899730 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899753 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899784 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.900875 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905463 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905511 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905537 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.906101 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.912503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"] Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.020742 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc 
kubenswrapper[4722]: I0226 20:33:41.124516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 
20:33:41.125691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.128483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.128838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.129069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.129853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: 
\"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.130420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.130941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.131065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.133437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.142556 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.144753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.231652 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:41.744879 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"] Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:41.813547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerStarted","Data":"f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97"} Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:42.145922 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:42 crc kubenswrapper[4722]: E0226 20:33:42.146326 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:43 crc kubenswrapper[4722]: I0226 20:33:43.845928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerStarted","Data":"d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e"} Feb 26 20:33:43 crc kubenswrapper[4722]: I0226 20:33:43.870744 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" podStartSLOduration=3.0900642720000002 podStartE2EDuration="3.870722955s" podCreationTimestamp="2026-02-26 20:33:40 +0000 UTC" firstStartedPulling="2026-02-26 20:33:41.74021462 +0000 UTC m=+2364.277182544" lastFinishedPulling="2026-02-26 20:33:42.520873303 +0000 UTC m=+2365.057841227" observedRunningTime="2026-02-26 20:33:43.867129988 +0000 UTC m=+2366.404097942" watchObservedRunningTime="2026-02-26 20:33:43.870722955 +0000 UTC m=+2366.407690889" Feb 26 20:33:56 crc kubenswrapper[4722]: I0226 20:33:56.146511 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:56 crc kubenswrapper[4722]: E0226 20:33:56.147964 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.162061 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.163720 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167308 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167859 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.184915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.264495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.366524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.385013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hgr\" (UniqueName: 
\"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.484413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.641697 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.644355 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.670771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776648 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.878901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.901317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.984695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:01 crc kubenswrapper[4722]: I0226 20:34:01.028422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:01 crc kubenswrapper[4722]: I0226 20:34:01.522775 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.025456 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" exitCode=0 Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.026461 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498"} Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.026496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" 
event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"069518085a75ed7209da17b41ec6cd53c47c868572bb5a2358ce82d4c98e45e4"} Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.028681 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerStarted","Data":"bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e"} Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.038821 4722 generic.go:334] "Generic (PLEG): container finished" podID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerID="ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2" exitCode=0 Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.038875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerDied","Data":"ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2"} Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.041405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.572239 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.687059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.694552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr" (OuterVolumeSpecName: "kube-api-access-28hgr") pod "4471a81e-751a-4e3c-b0b6-9e21c7106c2e" (UID: "4471a81e-751a-4e3c-b0b6-9e21c7106c2e"). InnerVolumeSpecName "kube-api-access-28hgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.790264 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.063786 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" exitCode=0 Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.063856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065527 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerDied","Data":"bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e"} Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065591 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.656575 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.666890 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.075552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.096028 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pcjv2" podStartSLOduration=2.686160955 podStartE2EDuration="6.096005879s" podCreationTimestamp="2026-02-26 20:34:00 +0000 UTC" firstStartedPulling="2026-02-26 20:34:02.027375389 +0000 UTC m=+2384.564343313" lastFinishedPulling="2026-02-26 20:34:05.437220323 +0000 UTC m=+2387.974188237" observedRunningTime="2026-02-26 20:34:06.090154551 +0000 UTC m=+2388.627122485" watchObservedRunningTime="2026-02-26 20:34:06.096005879 +0000 UTC m=+2388.632973803" Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.157336 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" path="/var/lib/kubelet/pods/bc1f2f35-9607-4719-993b-8678440d3a0b/volumes" Feb 26 20:34:09 crc kubenswrapper[4722]: I0226 20:34:09.146168 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:09 crc kubenswrapper[4722]: E0226 20:34:09.146841 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:10 crc kubenswrapper[4722]: I0226 20:34:10.986770 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:10 crc kubenswrapper[4722]: I0226 20:34:10.987181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.047554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.175315 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.290237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.138851 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pcjv2" 
podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" containerID="cri-o://e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" gracePeriod=2 Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.720993 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.778270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities" (OuterVolumeSpecName: "utilities") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782786 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.785025 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.793496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf" (OuterVolumeSpecName: "kube-api-access-4fjpf") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "kube-api-access-4fjpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.815629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.887812 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.887841 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.150291 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" exitCode=0 Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.150391 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"069518085a75ed7209da17b41ec6cd53c47c868572bb5a2358ce82d4c98e45e4"} Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155899 4722 scope.go:117] "RemoveContainer" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.187415 4722 scope.go:117] "RemoveContainer" 
containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.192334 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.202392 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.210948 4722 scope.go:117] "RemoveContainer" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.265434 4722 scope.go:117] "RemoveContainer" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266012 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": container with ID starting with e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3 not found: ID does not exist" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266098 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} err="failed to get container status \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": rpc error: code = NotFound desc = could not find container \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": container with ID starting with e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3 not found: ID does not exist" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266126 4722 scope.go:117] "RemoveContainer" 
containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266450 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": container with ID starting with f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb not found: ID does not exist" containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266477 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} err="failed to get container status \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": rpc error: code = NotFound desc = could not find container \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": container with ID starting with f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb not found: ID does not exist" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266492 4722 scope.go:117] "RemoveContainer" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266848 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": container with ID starting with d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498 not found: ID does not exist" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266868 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498"} err="failed to get container status \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": rpc error: code = NotFound desc = could not find container \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": container with ID starting with d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498 not found: ID does not exist" Feb 26 20:34:16 crc kubenswrapper[4722]: I0226 20:34:16.160039 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae450584-de96-4a61-aeeb-07581148e9be" path="/var/lib/kubelet/pods/ae450584-de96-4a61-aeeb-07581148e9be/volumes" Feb 26 20:34:24 crc kubenswrapper[4722]: I0226 20:34:24.146949 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:24 crc kubenswrapper[4722]: E0226 20:34:24.147798 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:27 crc kubenswrapper[4722]: I0226 20:34:27.741108 4722 scope.go:117] "RemoveContainer" containerID="c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c" Feb 26 20:34:38 crc kubenswrapper[4722]: I0226 20:34:38.157378 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:38 crc kubenswrapper[4722]: E0226 20:34:38.158086 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:49 crc kubenswrapper[4722]: I0226 20:34:49.146586 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:49 crc kubenswrapper[4722]: E0226 20:34:49.166705 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:00 crc kubenswrapper[4722]: I0226 20:35:00.146169 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:00 crc kubenswrapper[4722]: E0226 20:35:00.146836 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:14 crc kubenswrapper[4722]: I0226 20:35:14.146481 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:14 crc kubenswrapper[4722]: E0226 20:35:14.147122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:26 crc kubenswrapper[4722]: I0226 20:35:26.146284 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:26 crc kubenswrapper[4722]: E0226 20:35:26.147106 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:39 crc kubenswrapper[4722]: I0226 20:35:39.145970 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:39 crc kubenswrapper[4722]: E0226 20:35:39.146667 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:51 crc kubenswrapper[4722]: I0226 20:35:51.146491 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:51 crc kubenswrapper[4722]: E0226 20:35:51.147423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:59 crc kubenswrapper[4722]: I0226 20:35:59.134831 4722 generic.go:334] "Generic (PLEG): container finished" podID="6d48f7c6-d170-4dea-9214-5324870b8311" containerID="d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e" exitCode=0 Feb 26 20:35:59 crc kubenswrapper[4722]: I0226 20:35:59.134944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerDied","Data":"d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e"} Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.144879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-utilities" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145631 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-utilities" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145653 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-content" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145660 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-content" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145683 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145879 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145896 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.146985 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153656 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153776 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.161528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.322513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.425638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.455356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " 
pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.466882 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.618340 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.733882 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.733962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734088 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734186 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734258 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734455 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.747923 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9" (OuterVolumeSpecName: "kube-api-access-ldpv9") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "kube-api-access-ldpv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.755469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.774609 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.776413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.777614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.783669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory" (OuterVolumeSpecName: "inventory") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.786334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.799707 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.800893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.801263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.808403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836801 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836839 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836849 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836859 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836871 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836882 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836891 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836901 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836911 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836921 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836934 4722 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.931190 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152753 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerDied","Data":"f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97"} Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152850 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.153763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerStarted","Data":"1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6"} Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257077 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:01 crc kubenswrapper[4722]: E0226 20:36:01.257785 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257801 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257999 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.258696 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269272 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269683 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269809 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.270574 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.271070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.298761 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.346892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.346967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: 
\"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449774 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.454681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.454774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.455111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.455325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.456663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.457784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.467817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.581429 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.146829 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:02 crc kubenswrapper[4722]: E0226 20:36:02.147344 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.167358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerStarted","Data":"66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143"} Feb 26 20:36:02 crc kubenswrapper[4722]: W0226 20:36:02.172910 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1f8648_e221_4b8e_8691_5e88fc460998.slice/crio-3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365 WatchSource:0}: Error finding container 3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365: Status 404 returned error can't find the container with id 3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365 Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.176391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.188451 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29535636-58m8m" podStartSLOduration=1.3296123180000001 podStartE2EDuration="2.188427481s" podCreationTimestamp="2026-02-26 20:36:00 +0000 UTC" firstStartedPulling="2026-02-26 20:36:00.93127487 +0000 UTC m=+2503.468242794" lastFinishedPulling="2026-02-26 20:36:01.790090033 +0000 UTC m=+2504.327057957" observedRunningTime="2026-02-26 20:36:02.183883468 +0000 UTC m=+2504.720851392" watchObservedRunningTime="2026-02-26 20:36:02.188427481 +0000 UTC m=+2504.725395405" Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.179654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerStarted","Data":"5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.180106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerStarted","Data":"3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.185020 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerID="66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143" exitCode=0 Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.185063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerDied","Data":"66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.213674 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" podStartSLOduration=1.774775041 
podStartE2EDuration="2.213638893s" podCreationTimestamp="2026-02-26 20:36:01 +0000 UTC" firstStartedPulling="2026-02-26 20:36:02.174807463 +0000 UTC m=+2504.711775387" lastFinishedPulling="2026-02-26 20:36:02.613671315 +0000 UTC m=+2505.150639239" observedRunningTime="2026-02-26 20:36:03.200278172 +0000 UTC m=+2505.737246126" watchObservedRunningTime="2026-02-26 20:36:03.213638893 +0000 UTC m=+2505.750606877" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.631858 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.717894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"ba671314-b24c-4e8d-9f36-2d823e2233eb\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.723633 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs" (OuterVolumeSpecName: "kube-api-access-hg7zs") pod "ba671314-b24c-4e8d-9f36-2d823e2233eb" (UID: "ba671314-b24c-4e8d-9f36-2d823e2233eb"). InnerVolumeSpecName "kube-api-access-hg7zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.823431 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerDied","Data":"1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6"} Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203814 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203617 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.265911 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.275110 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:36:06 crc kubenswrapper[4722]: I0226 20:36:06.162008 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60589e31-13a5-410b-926f-511d262459da" path="/var/lib/kubelet/pods/60589e31-13a5-410b-926f-511d262459da/volumes" Feb 26 20:36:13 crc kubenswrapper[4722]: I0226 20:36:13.146723 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:13 crc kubenswrapper[4722]: E0226 20:36:13.147664 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:25 crc kubenswrapper[4722]: I0226 20:36:25.146081 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:25 crc kubenswrapper[4722]: E0226 20:36:25.147203 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:27 crc kubenswrapper[4722]: I0226 20:36:27.857507 4722 scope.go:117] "RemoveContainer" containerID="d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f" Feb 26 20:36:38 crc kubenswrapper[4722]: I0226 20:36:38.155866 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:38 crc kubenswrapper[4722]: E0226 20:36:38.158011 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:49 crc kubenswrapper[4722]: I0226 20:36:49.146909 4722 scope.go:117] "RemoveContainer" 
containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:49 crc kubenswrapper[4722]: E0226 20:36:49.147811 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:37:01 crc kubenswrapper[4722]: I0226 20:37:01.145867 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:37:01 crc kubenswrapper[4722]: I0226 20:37:01.800758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"} Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.144820 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: E0226 20:38:00.145816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.145830 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.146048 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.147021 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149651 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149834 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149957 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.173950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.279777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.382846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.401640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " 
pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.473759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.913701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.917612 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:38:01 crc kubenswrapper[4722]: I0226 20:38:01.373683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerStarted","Data":"856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b"} Feb 26 20:38:02 crc kubenswrapper[4722]: I0226 20:38:02.384637 4722 generic.go:334] "Generic (PLEG): container finished" podID="503976e9-dfb6-46c7-96af-9e53160418ac" containerID="70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64" exitCode=0 Feb 26 20:38:02 crc kubenswrapper[4722]: I0226 20:38:02.384855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerDied","Data":"70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64"} Feb 26 20:38:03 crc kubenswrapper[4722]: I0226 20:38:03.870690 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.049054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"503976e9-dfb6-46c7-96af-9e53160418ac\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.056398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4" (OuterVolumeSpecName: "kube-api-access-8xvm4") pod "503976e9-dfb6-46c7-96af-9e53160418ac" (UID: "503976e9-dfb6-46c7-96af-9e53160418ac"). InnerVolumeSpecName "kube-api-access-8xvm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.152065 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerDied","Data":"856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b"} Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415127 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415286 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.965667 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.976194 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:38:06 crc kubenswrapper[4722]: I0226 20:38:06.158223 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" path="/var/lib/kubelet/pods/da7ba56d-affb-4cc4-ba3e-d43c0265d472/volumes" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.227719 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:16 crc kubenswrapper[4722]: E0226 20:38:16.231152 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.231281 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.231587 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.233866 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.241379 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.414087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.414184 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.432247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.553602 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.036869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546018 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af" exitCode=0 Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"} Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"24784ed91703128681052c6e00ce5ec677cb9a182fcee5d011f47a9dd817fab6"} Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.558251 4722 generic.go:334] "Generic (PLEG): container finished" podID="da1f8648-e221-4b8e-8691-5e88fc460998" containerID="5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e" exitCode=0 Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.558301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerDied","Data":"5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e"} Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.561935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" 
event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.279325 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.300945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301303 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301375 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.325962 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89" (OuterVolumeSpecName: "kube-api-access-vvn89") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "kube-api-access-vvn89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.334458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.339516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.347044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.365637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.374288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.383312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory" (OuterVolumeSpecName: "inventory") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403884 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403914 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403924 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403955 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403981 4722 reconciler_common.go:293] 
"Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403989 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.584970 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00" exitCode=0 Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.585048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerDied","Data":"3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588758 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588813 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:38:22 crc kubenswrapper[4722]: I0226 20:38:22.613100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"} Feb 26 20:38:22 crc kubenswrapper[4722]: I0226 20:38:22.637772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pkc9" podStartSLOduration=2.754481469 podStartE2EDuration="6.637752021s" podCreationTimestamp="2026-02-26 20:38:16 +0000 UTC" firstStartedPulling="2026-02-26 20:38:17.547417992 +0000 UTC m=+2640.084385916" lastFinishedPulling="2026-02-26 20:38:21.430688544 +0000 UTC m=+2643.967656468" observedRunningTime="2026-02-26 20:38:22.63035463 +0000 UTC m=+2645.167322554" watchObservedRunningTime="2026-02-26 20:38:22.637752021 +0000 UTC m=+2645.174719955" Feb 26 20:38:26 crc kubenswrapper[4722]: I0226 20:38:26.554017 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:26 crc kubenswrapper[4722]: I0226 20:38:26.554614 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:27 crc kubenswrapper[4722]: I0226 20:38:27.600456 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7pkc9" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" probeResult="failure" output=< Feb 26 20:38:27 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:38:27 crc kubenswrapper[4722]: > Feb 26 20:38:27 crc kubenswrapper[4722]: I0226 20:38:27.976816 4722 scope.go:117] "RemoveContainer" 
containerID="f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a" Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.599878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.648599 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.838022 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:37 crc kubenswrapper[4722]: I0226 20:38:37.755371 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pkc9" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" containerID="cri-o://8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" gracePeriod=2 Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.327886 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.519040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities" (OuterVolumeSpecName: "utilities") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.524755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52" (OuterVolumeSpecName: "kube-api-access-94l52") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "kube-api-access-94l52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.621068 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.621105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.659724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.723562 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766260 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" exitCode=0 Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766660 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"} Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.767595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"24784ed91703128681052c6e00ce5ec677cb9a182fcee5d011f47a9dd817fab6"} Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.767619 4722 scope.go:117] "RemoveContainer" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.788618 4722 scope.go:117] "RemoveContainer" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.806328 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.814116 4722 scope.go:117] "RemoveContainer" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.816564 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.858488 4722 scope.go:117] "RemoveContainer" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.858932 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": container with ID starting with 8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b not found: ID does not exist" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.858981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"} err="failed to get container status \"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": rpc error: code = NotFound desc = could not find container \"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": container with ID starting with 8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b not found: ID does not exist" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859003 4722 scope.go:117] "RemoveContainer" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00" Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.859559 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": container with ID starting with 95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00 not found: ID does not exist" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} err="failed to get container status \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": rpc error: code = NotFound desc = could not find container \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": container with ID 
starting with 95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00 not found: ID does not exist" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859598 4722 scope.go:117] "RemoveContainer" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af" Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.859815 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": container with ID starting with 639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af not found: ID does not exist" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af" Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859847 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"} err="failed to get container status \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": rpc error: code = NotFound desc = could not find container \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": container with ID starting with 639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af not found: ID does not exist" Feb 26 20:38:40 crc kubenswrapper[4722]: I0226 20:38:40.160573 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" path="/var/lib/kubelet/pods/323cd04c-a631-46ed-a2cb-2f97f0a6a471/volumes" Feb 26 20:39:23 crc kubenswrapper[4722]: I0226 20:39:23.486980 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:39:23 crc kubenswrapper[4722]: I0226 
20:39:23.489886 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:39:53 crc kubenswrapper[4722]: I0226 20:39:53.487680 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:39:53 crc kubenswrapper[4722]: I0226 20:39:53.488328 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.158846 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159714 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159730 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159750 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-utilities" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159757 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-utilities" Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-content" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159780 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-content" Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159804 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.161744 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.161764 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.162864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.162943 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.168572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.169030 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.170153 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.225331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.326349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.351397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.488023 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:01 crc kubenswrapper[4722]: I0226 20:40:01.005666 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:40:01 crc kubenswrapper[4722]: I0226 20:40:01.590737 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerStarted","Data":"c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840"} Feb 26 20:40:03 crc kubenswrapper[4722]: I0226 20:40:03.612484 4722 generic.go:334] "Generic (PLEG): container finished" podID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerID="d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05" exitCode=0 Feb 26 20:40:03 crc kubenswrapper[4722]: I0226 20:40:03.612972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerDied","Data":"d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05"} Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.175529 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.228844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.234644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79" (OuterVolumeSpecName: "kube-api-access-pnd79") pod "d46cdb69-f149-44bc-bb3e-6f8b94e937c3" (UID: "d46cdb69-f149-44bc-bb3e-6f8b94e937c3"). InnerVolumeSpecName "kube-api-access-pnd79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.332069 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") on node \"crc\" DevicePath \"\"" Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631597 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerDied","Data":"c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840"} Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631636 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840" Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" Feb 26 20:40:06 crc kubenswrapper[4722]: I0226 20:40:06.246588 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:40:06 crc kubenswrapper[4722]: I0226 20:40:06.260759 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:40:08 crc kubenswrapper[4722]: I0226 20:40:08.168922 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" path="/var/lib/kubelet/pods/4471a81e-751a-4e3c-b0b6-9e21c7106c2e/volumes" Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.487707 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488066 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488762 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488809 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6" gracePeriod=600 Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797223 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6" exitCode=0 Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"} Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797303 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:40:24 crc kubenswrapper[4722]: I0226 20:40:24.812078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"} Feb 26 20:40:28 crc kubenswrapper[4722]: I0226 20:40:28.085896 4722 scope.go:117] "RemoveContainer" containerID="ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.159924 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:42:00 crc kubenswrapper[4722]: E0226 
20:42:00.160717 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.160728 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.160928 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.161725 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.164022 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.166765 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.168265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.168538 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.200557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.303098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.323959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.483007 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.931770 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:42:00 crc kubenswrapper[4722]: W0226 20:42:00.939034 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf44450_97f2_474b_abf8_9c306e6d5679.slice/crio-f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff WatchSource:0}: Error finding container f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff: Status 404 returned error can't find the container with id f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff Feb 26 20:42:01 crc kubenswrapper[4722]: I0226 20:42:01.280735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerStarted","Data":"f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff"} Feb 26 20:42:02 crc kubenswrapper[4722]: I0226 20:42:02.291305 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerStarted","Data":"3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d"} Feb 26 20:42:02 crc kubenswrapper[4722]: I0226 20:42:02.312090 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" podStartSLOduration=1.291986517 podStartE2EDuration="2.312038068s" podCreationTimestamp="2026-02-26 20:42:00 +0000 UTC" firstStartedPulling="2026-02-26 20:42:00.942293414 +0000 UTC m=+2863.479261338" lastFinishedPulling="2026-02-26 20:42:01.962344965 +0000 UTC m=+2864.499312889" observedRunningTime="2026-02-26 20:42:02.303759363 +0000 UTC m=+2864.840727287" watchObservedRunningTime="2026-02-26 20:42:02.312038068 +0000 UTC m=+2864.849005992" Feb 26 20:42:03 crc kubenswrapper[4722]: I0226 20:42:03.301416 4722 generic.go:334] "Generic (PLEG): container finished" podID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerID="3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d" exitCode=0 Feb 26 20:42:03 crc kubenswrapper[4722]: I0226 20:42:03.301485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerDied","Data":"3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d"} Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.770351 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.916488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"bcf44450-97f2-474b-abf8-9c306e6d5679\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.922008 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw" (OuterVolumeSpecName: "kube-api-access-t8bvw") pod "bcf44450-97f2-474b-abf8-9c306e6d5679" (UID: "bcf44450-97f2-474b-abf8-9c306e6d5679"). InnerVolumeSpecName "kube-api-access-t8bvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.019488 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerDied","Data":"f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff"} Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324454 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff" Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324487 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.401528 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.409308 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:42:06 crc kubenswrapper[4722]: I0226 20:42:06.160427 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" path="/var/lib/kubelet/pods/ba671314-b24c-4e8d-9f36-2d823e2233eb/volumes" Feb 26 20:42:23 crc kubenswrapper[4722]: I0226 20:42:23.487543 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:42:23 crc kubenswrapper[4722]: I0226 20:42:23.488030 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:42:28 crc kubenswrapper[4722]: I0226 20:42:28.184025 4722 scope.go:117] "RemoveContainer" containerID="66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.488612 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:31 crc kubenswrapper[4722]: E0226 20:42:31.489782 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc" Feb 26 20:42:31 crc 
kubenswrapper[4722]: I0226 20:42:31.489801 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.490078 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.492114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.502525 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.560830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.561636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.561844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc 
kubenswrapper[4722]: I0226 20:42:31.664386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.664440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.664467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.665169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.665185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.684619 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.818296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:32 crc kubenswrapper[4722]: I0226 20:42:32.384937 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:32 crc kubenswrapper[4722]: I0226 20:42:32.568323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"3c0e33446cf2e630a8e1358432f28a1c6d230783000cf3a267d0918f141303b1"} Feb 26 20:42:33 crc kubenswrapper[4722]: I0226 20:42:33.579810 4722 generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" exitCode=0 Feb 26 20:42:33 crc kubenswrapper[4722]: I0226 20:42:33.580228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8"} Feb 26 20:42:34 crc kubenswrapper[4722]: I0226 20:42:34.594075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} Feb 26 20:42:36 crc kubenswrapper[4722]: I0226 20:42:36.617271 4722 
generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" exitCode=0 Feb 26 20:42:36 crc kubenswrapper[4722]: I0226 20:42:36.617350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} Feb 26 20:42:37 crc kubenswrapper[4722]: I0226 20:42:37.628807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} Feb 26 20:42:37 crc kubenswrapper[4722]: I0226 20:42:37.655400 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9f4w8" podStartSLOduration=3.2447646629999998 podStartE2EDuration="6.655379262s" podCreationTimestamp="2026-02-26 20:42:31 +0000 UTC" firstStartedPulling="2026-02-26 20:42:33.583987055 +0000 UTC m=+2896.120954979" lastFinishedPulling="2026-02-26 20:42:36.994601654 +0000 UTC m=+2899.531569578" observedRunningTime="2026-02-26 20:42:37.649103403 +0000 UTC m=+2900.186071357" watchObservedRunningTime="2026-02-26 20:42:37.655379262 +0000 UTC m=+2900.192347186" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.819270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.819562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.867435 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:42 crc kubenswrapper[4722]: I0226 20:42:42.743675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:42 crc kubenswrapper[4722]: I0226 20:42:42.791640 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:44 crc kubenswrapper[4722]: I0226 20:42:44.701343 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9f4w8" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" containerID="cri-o://a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" gracePeriod=2 Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.217812 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod 
\"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.351732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities" (OuterVolumeSpecName: "utilities") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.359623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn" (OuterVolumeSpecName: "kube-api-access-9zchn") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "kube-api-access-9zchn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.403677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453175 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453213 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453228 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720310 4722 generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" exitCode=0 Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"3c0e33446cf2e630a8e1358432f28a1c6d230783000cf3a267d0918f141303b1"} Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720390 4722 scope.go:117] "RemoveContainer" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 
20:42:45.720500 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.757683 4722 scope.go:117] "RemoveContainer" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.757854 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.767461 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.792548 4722 scope.go:117] "RemoveContainer" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.840495 4722 scope.go:117] "RemoveContainer" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.840983 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": container with ID starting with a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6 not found: ID does not exist" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841082 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} err="failed to get container status \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": rpc error: code = NotFound desc = could not find container \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": container with ID starting with 
a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6 not found: ID does not exist" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841189 4722 scope.go:117] "RemoveContainer" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.841601 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": container with ID starting with bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa not found: ID does not exist" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841709 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} err="failed to get container status \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": rpc error: code = NotFound desc = could not find container \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": container with ID starting with bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa not found: ID does not exist" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841783 4722 scope.go:117] "RemoveContainer" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.842035 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": container with ID starting with 100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8 not found: ID does not exist" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc 
kubenswrapper[4722]: I0226 20:42:45.842117 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8"} err="failed to get container status \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": rpc error: code = NotFound desc = could not find container \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": container with ID starting with 100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8 not found: ID does not exist" Feb 26 20:42:46 crc kubenswrapper[4722]: I0226 20:42:46.156273 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" path="/var/lib/kubelet/pods/35f9907b-527d-407c-9f61-2a163bdcdf40/volumes" Feb 26 20:42:53 crc kubenswrapper[4722]: I0226 20:42:53.486977 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:42:53 crc kubenswrapper[4722]: I0226 20:42:53.487528 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.975477 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976783 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-utilities" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 
20:42:57.976800 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-utilities" Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.976850 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976865 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-content" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.976873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-content" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.977119 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.978893 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.990717 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215676 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.216574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.217716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.238849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.300786 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.793741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:58 crc kubenswrapper[4722]: W0226 20:42:58.794923 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde1abe77_7ea4_451a_aa5d_7bd0605ebbe5.slice/crio-c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df WatchSource:0}: Error finding container c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df: Status 404 returned error can't find the container with id c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.859467 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df"} Feb 26 20:42:59 crc kubenswrapper[4722]: I0226 20:42:59.877111 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83" exitCode=0 Feb 26 20:42:59 crc kubenswrapper[4722]: I0226 20:42:59.877224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83"} Feb 26 20:43:04 crc kubenswrapper[4722]: I0226 20:43:04.923872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" 
event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12"} Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.937105 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12" exitCode=0 Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.937163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12"} Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.940176 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:43:06 crc kubenswrapper[4722]: I0226 20:43:06.947774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3"} Feb 26 20:43:06 crc kubenswrapper[4722]: I0226 20:43:06.965342 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zkkbw" podStartSLOduration=3.199144548 podStartE2EDuration="9.965323603s" podCreationTimestamp="2026-02-26 20:42:57 +0000 UTC" firstStartedPulling="2026-02-26 20:42:59.880231369 +0000 UTC m=+2922.417199293" lastFinishedPulling="2026-02-26 20:43:06.646410414 +0000 UTC m=+2929.183378348" observedRunningTime="2026-02-26 20:43:06.964953873 +0000 UTC m=+2929.501921807" watchObservedRunningTime="2026-02-26 20:43:06.965323603 +0000 UTC m=+2929.502291537" Feb 26 20:43:08 crc kubenswrapper[4722]: I0226 20:43:08.301126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:08 crc kubenswrapper[4722]: I0226 20:43:08.301304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:09 crc kubenswrapper[4722]: I0226 20:43:09.346924 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zkkbw" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" probeResult="failure" output=< Feb 26 20:43:09 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:43:09 crc kubenswrapper[4722]: > Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.408901 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.468327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.531572 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.653997 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.654272 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8tbpk" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server" containerID="cri-o://ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e" gracePeriod=2 Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072231 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" 
containerID="ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e" exitCode=0 Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e"} Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64"} Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072585 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.155860 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345330 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.346164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities" (OuterVolumeSpecName: "utilities") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.351134 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc" (OuterVolumeSpecName: "kube-api-access-g2tmc") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "kube-api-access-g2tmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.400754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.448994 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.449050 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") on node \"crc\" DevicePath \"\"" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.449065 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.080757 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.115414 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.133261 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.160930 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" path="/var/lib/kubelet/pods/704856f2-b29f-4fc8-8f18-a59104f507e9/volumes" Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487032 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487607 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.488529 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.488591 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" gracePeriod=600 Feb 26 20:43:23 crc kubenswrapper[4722]: E0226 20:43:23.616450 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117102 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" exitCode=0 Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"} Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117469 4722 scope.go:117] "RemoveContainer" containerID="fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6" Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.118341 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:43:24 crc kubenswrapper[4722]: E0226 20:43:24.118951 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.258733 4722 scope.go:117] "RemoveContainer" containerID="4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77" Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.284412 4722 scope.go:117] "RemoveContainer" containerID="ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e" Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.345163 4722 scope.go:117] "RemoveContainer" containerID="54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41" Feb 26 20:43:36 crc kubenswrapper[4722]: I0226 20:43:36.146644 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:43:36 crc kubenswrapper[4722]: E0226 20:43:36.148287 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:43:48 crc kubenswrapper[4722]: I0226 20:43:48.173206 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:43:48 crc kubenswrapper[4722]: E0226 20:43:48.174924 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.139998 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"] Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140852 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-utilities" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140863 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-utilities" Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-content" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140898 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-content" Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140918 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140924 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.141115 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.141861 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.146871 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.147297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.147455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.162292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"] Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.303709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.406161 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.427954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " 
pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.459557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.929283 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"] Feb 26 20:44:01 crc kubenswrapper[4722]: I0226 20:44:01.466088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerStarted","Data":"beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419"} Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.145887 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:44:03 crc kubenswrapper[4722]: E0226 20:44:03.146429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.486984 4722 generic.go:334] "Generic (PLEG): container finished" podID="d17ea072-9011-410f-ae84-267fefe73604" containerID="aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475" exitCode=0 Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.487034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerDied","Data":"aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475"} 
Feb 26 20:44:04 crc kubenswrapper[4722]: I0226 20:44:04.954601 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.106766 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"d17ea072-9011-410f-ae84-267fefe73604\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.114019 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr" (OuterVolumeSpecName: "kube-api-access-x4lxr") pod "d17ea072-9011-410f-ae84-267fefe73604" (UID: "d17ea072-9011-410f-ae84-267fefe73604"). InnerVolumeSpecName "kube-api-access-x4lxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.209920 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") on node \"crc\" DevicePath \"\"" Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerDied","Data":"beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419"} Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507842 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419" Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507887 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt" Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.020872 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.030326 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.158837 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" path="/var/lib/kubelet/pods/503976e9-dfb6-46c7-96af-9e53160418ac/volumes" Feb 26 20:44:15 crc kubenswrapper[4722]: I0226 20:44:15.146395 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:44:15 crc kubenswrapper[4722]: E0226 20:44:15.147147 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:28 crc kubenswrapper[4722]: I0226 20:44:28.437396 4722 scope.go:117] "RemoveContainer" containerID="70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64" Feb 26 20:44:29 crc kubenswrapper[4722]: I0226 20:44:29.146355 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:44:29 crc kubenswrapper[4722]: E0226 20:44:29.147045 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:41 crc kubenswrapper[4722]: I0226 20:44:41.145988 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:44:41 crc kubenswrapper[4722]: E0226 20:44:41.147429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.662166 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:44 crc kubenswrapper[4722]: E0226 20:44:44.663180 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.663197 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.663440 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.665878 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.699121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834959 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.937386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938082 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.961769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.007663 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.495911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.970862 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9" exitCode=0 Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.971947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"} Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.972102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"4be7728ce0ce39f273ef976a5ddd163249dab728c160f113907d92485ddfc1a3"} Feb 26 20:44:49 crc kubenswrapper[4722]: I0226 20:44:49.001053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"} Feb 26 20:44:50 crc kubenswrapper[4722]: I0226 20:44:50.012995 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5" exitCode=0 Feb 26 20:44:50 crc kubenswrapper[4722]: I0226 20:44:50.013043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" 
event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"} Feb 26 20:44:51 crc kubenswrapper[4722]: I0226 20:44:51.025532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"} Feb 26 20:44:51 crc kubenswrapper[4722]: I0226 20:44:51.057605 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2l5zj" podStartSLOduration=2.615025808 podStartE2EDuration="7.057576379s" podCreationTimestamp="2026-02-26 20:44:44 +0000 UTC" firstStartedPulling="2026-02-26 20:44:45.973186861 +0000 UTC m=+3028.510154785" lastFinishedPulling="2026-02-26 20:44:50.415737432 +0000 UTC m=+3032.952705356" observedRunningTime="2026-02-26 20:44:51.04879248 +0000 UTC m=+3033.585760414" watchObservedRunningTime="2026-02-26 20:44:51.057576379 +0000 UTC m=+3033.594544323" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.008272 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.008590 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.060527 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.124410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.146066 4722 scope.go:117] "RemoveContainer" 
containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:44:55 crc kubenswrapper[4722]: E0226 20:44:55.146371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.301551 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.082433 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2l5zj" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" containerID="cri-o://af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" gracePeriod=2 Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.590327 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599751 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599909 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.605564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities" (OuterVolumeSpecName: "utilities") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.629668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.634484 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj" (OuterVolumeSpecName: "kube-api-access-h6pvj") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "kube-api-access-h6pvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703267 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703296 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") on node \"crc\" DevicePath \"\"" Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703310 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096110 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" exitCode=0 Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"} Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096206 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"4be7728ce0ce39f273ef976a5ddd163249dab728c160f113907d92485ddfc1a3"} Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096223 4722 scope.go:117] "RemoveContainer" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096269 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.118317 4722 scope.go:117] "RemoveContainer" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.154918 4722 scope.go:117] "RemoveContainer" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.166035 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.166073 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"] Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.221127 4722 scope.go:117] "RemoveContainer" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" Feb 26 20:44:58 crc kubenswrapper[4722]: E0226 20:44:58.221618 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": container with ID starting with af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275 not found: ID does not exist" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 
20:44:58.221666 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"} err="failed to get container status \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": rpc error: code = NotFound desc = could not find container \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": container with ID starting with af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275 not found: ID does not exist" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.221695 4722 scope.go:117] "RemoveContainer" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5" Feb 26 20:44:58 crc kubenswrapper[4722]: E0226 20:44:58.222113 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": container with ID starting with 26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5 not found: ID does not exist" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222183 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"} err="failed to get container status \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": rpc error: code = NotFound desc = could not find container \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": container with ID starting with 26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5 not found: ID does not exist" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222209 4722 scope.go:117] "RemoveContainer" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9" Feb 26 20:44:58 crc 
kubenswrapper[4722]: E0226 20:44:58.222515 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": container with ID starting with 321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9 not found: ID does not exist" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9" Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222615 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"} err="failed to get container status \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": rpc error: code = NotFound desc = could not find container \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": container with ID starting with 321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9 not found: ID does not exist" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.158812 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" path="/var/lib/kubelet/pods/83424746-4509-4c5b-a59d-6c00f8eecd04/volumes" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.160815 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"] Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161238 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-utilities" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161309 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-utilities" Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161405 4722 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161456 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161524 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-content" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161577 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-content" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161829 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.162711 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.164908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.165255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.165611 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"] Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod 
\"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349760 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451418 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.452156 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.456437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.471196 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.488417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:01 crc kubenswrapper[4722]: I0226 20:45:01.019015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"] Feb 26 20:45:01 crc kubenswrapper[4722]: I0226 20:45:01.127717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerStarted","Data":"cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561"} Feb 26 20:45:02 crc kubenswrapper[4722]: I0226 20:45:02.138401 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerID="ba18e16e62e3003ecd59cc6d50346b8f4d9d1c21189513898aa6568f86a33abd" exitCode=0 Feb 26 20:45:02 crc kubenswrapper[4722]: I0226 20:45:02.138465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerDied","Data":"ba18e16e62e3003ecd59cc6d50346b8f4d9d1c21189513898aa6568f86a33abd"} Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.630254 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.831491 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.832207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.832304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.833238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume" (OuterVolumeSpecName: "config-volume") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.838849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.843460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf" (OuterVolumeSpecName: "kube-api-access-khlrf") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "kube-api-access-khlrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934795 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934833 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934845 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.158488 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.161800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerDied","Data":"cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561"} Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.161851 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.701682 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.711332 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:45:06 crc kubenswrapper[4722]: I0226 20:45:06.160707 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" path="/var/lib/kubelet/pods/7115d78f-2013-4549-ab88-5fde72d4267f/volumes" Feb 26 20:45:07 crc kubenswrapper[4722]: I0226 20:45:07.146667 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:07 crc kubenswrapper[4722]: E0226 20:45:07.147272 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:20 crc 
kubenswrapper[4722]: I0226 20:45:20.146767 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:20 crc kubenswrapper[4722]: E0226 20:45:20.147588 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:28 crc kubenswrapper[4722]: I0226 20:45:28.514931 4722 scope.go:117] "RemoveContainer" containerID="9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c" Feb 26 20:45:31 crc kubenswrapper[4722]: I0226 20:45:31.146974 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:31 crc kubenswrapper[4722]: E0226 20:45:31.147869 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:45 crc kubenswrapper[4722]: I0226 20:45:45.146388 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:45 crc kubenswrapper[4722]: E0226 20:45:45.147170 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:56 crc kubenswrapper[4722]: I0226 20:45:56.146772 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:56 crc kubenswrapper[4722]: E0226 20:45:56.147581 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.161008 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:00 crc kubenswrapper[4722]: E0226 20:46:00.161769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.161782 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.162005 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.162767 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.163947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.165800 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.166039 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.166245 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.170714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.265950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.282876 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " 
pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.481191 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: W0226 20:46:00.972644 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7591020_38d2_4c4d_9c5f_958bc7a73ea8.slice/crio-6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb WatchSource:0}: Error finding container 6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb: Status 404 returned error can't find the container with id 6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.975307 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:01 crc kubenswrapper[4722]: I0226 20:46:01.253578 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerStarted","Data":"6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb"} Feb 26 20:46:03 crc kubenswrapper[4722]: I0226 20:46:03.272790 4722 generic.go:334] "Generic (PLEG): container finished" podID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerID="498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7" exitCode=0 Feb 26 20:46:03 crc kubenswrapper[4722]: I0226 20:46:03.273072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerDied","Data":"498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7"} Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.761276 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.873584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.880405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx" (OuterVolumeSpecName: "kube-api-access-t88cx") pod "a7591020-38d2-4c4d-9c5f-958bc7a73ea8" (UID: "a7591020-38d2-4c4d-9c5f-958bc7a73ea8"). InnerVolumeSpecName "kube-api-access-t88cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.975758 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") on node \"crc\" DevicePath \"\"" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerDied","Data":"6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb"} Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293533 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293255 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:05 crc kubenswrapper[4722]: E0226 20:46:05.364839 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7591020_38d2_4c4d_9c5f_958bc7a73ea8.slice\": RecentStats: unable to find data in memory cache]" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.836693 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.845037 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:46:06 crc kubenswrapper[4722]: I0226 20:46:06.160855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" path="/var/lib/kubelet/pods/d46cdb69-f149-44bc-bb3e-6f8b94e937c3/volumes" Feb 26 20:46:08 crc kubenswrapper[4722]: I0226 20:46:08.155899 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:08 crc kubenswrapper[4722]: E0226 20:46:08.156837 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:20 crc kubenswrapper[4722]: I0226 20:46:20.146335 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:20 crc kubenswrapper[4722]: E0226 20:46:20.147051 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:28 crc kubenswrapper[4722]: I0226 20:46:28.581733 4722 scope.go:117] "RemoveContainer" containerID="d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05" Feb 26 20:46:34 crc kubenswrapper[4722]: I0226 20:46:34.147415 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:34 crc kubenswrapper[4722]: E0226 20:46:34.148588 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:45 crc kubenswrapper[4722]: I0226 20:46:45.146252 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:45 crc kubenswrapper[4722]: E0226 20:46:45.148467 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:56 crc kubenswrapper[4722]: I0226 20:46:56.147090 4722 scope.go:117] "RemoveContainer" 
containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:56 crc kubenswrapper[4722]: E0226 20:46:56.148784 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:11 crc kubenswrapper[4722]: I0226 20:47:11.146559 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:11 crc kubenswrapper[4722]: E0226 20:47:11.147340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:25 crc kubenswrapper[4722]: I0226 20:47:25.146226 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:25 crc kubenswrapper[4722]: E0226 20:47:25.147042 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:40 crc kubenswrapper[4722]: I0226 20:47:40.146192 4722 scope.go:117] 
"RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:40 crc kubenswrapper[4722]: E0226 20:47:40.146885 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:53 crc kubenswrapper[4722]: I0226 20:47:53.146713 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:53 crc kubenswrapper[4722]: E0226 20:47:53.147876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.137064 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:00 crc kubenswrapper[4722]: E0226 20:48:00.138068 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.138084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.138280 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 
26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.139154 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.143796 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.143884 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.149327 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.157407 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.191129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.293266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.313553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bd9p\" (UniqueName: 
\"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.497055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.934063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:01 crc kubenswrapper[4722]: I0226 20:48:01.719447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerStarted","Data":"b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610"} Feb 26 20:48:02 crc kubenswrapper[4722]: I0226 20:48:02.729818 4722 generic.go:334] "Generic (PLEG): container finished" podID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerID="1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5" exitCode=0 Feb 26 20:48:02 crc kubenswrapper[4722]: I0226 20:48:02.730228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerDied","Data":"1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5"} Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.147658 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:04 crc kubenswrapper[4722]: E0226 20:48:04.148273 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.148294 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.169663 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"717dda76-7ae7-403a-92e5-5e268a396d1d\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.178054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p" (OuterVolumeSpecName: "kube-api-access-2bd9p") pod "717dda76-7ae7-403a-92e5-5e268a396d1d" (UID: "717dda76-7ae7-403a-92e5-5e268a396d1d"). InnerVolumeSpecName "kube-api-access-2bd9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.271663 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") on node \"crc\" DevicePath \"\"" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerDied","Data":"b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610"} Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750630 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750421 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:05 crc kubenswrapper[4722]: I0226 20:48:05.227408 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:48:05 crc kubenswrapper[4722]: I0226 20:48:05.237948 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:48:06 crc kubenswrapper[4722]: I0226 20:48:06.156575 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" path="/var/lib/kubelet/pods/bcf44450-97f2-474b-abf8-9c306e6d5679/volumes" Feb 26 20:48:15 crc kubenswrapper[4722]: I0226 20:48:15.146359 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:15 crc kubenswrapper[4722]: E0226 20:48:15.147076 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.338706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:28 crc kubenswrapper[4722]: E0226 20:48:28.339704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.339719 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.339941 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.341006 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.346702 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.346890 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.347144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.347261 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dvrdk" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.381781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522365 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522747 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624739 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624767 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 
20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625004 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.626711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.630318 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.630968 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.631264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.643021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.661827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.668587 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.679557 4722 scope.go:117] "RemoveContainer" containerID="3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d" Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.125133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.128277 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.146309 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.985490 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"} Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.987085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerStarted","Data":"01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77"} Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.552611 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.553374 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwmln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(48c7de81-f528-48d3-bb95-99a9cf36f43f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.556200 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" Feb 26 20:49:03 crc kubenswrapper[4722]: E0226 20:49:03.324545 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" Feb 26 20:49:18 crc 
kubenswrapper[4722]: I0226 20:49:18.463601 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerStarted","Data":"bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23"} Feb 26 20:49:18 crc kubenswrapper[4722]: I0226 20:49:18.504965 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.062075764 podStartE2EDuration="51.504944964s" podCreationTimestamp="2026-02-26 20:48:27 +0000 UTC" firstStartedPulling="2026-02-26 20:48:29.128002676 +0000 UTC m=+3251.664970600" lastFinishedPulling="2026-02-26 20:49:16.570871876 +0000 UTC m=+3299.107839800" observedRunningTime="2026-02-26 20:49:18.50441123 +0000 UTC m=+3301.041379154" watchObservedRunningTime="2026-02-26 20:49:18.504944964 +0000 UTC m=+3301.041912898" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.182192 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.184157 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.185852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.185987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.186016 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.193746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.279423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.381877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.406762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " 
pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.510477 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.970273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:50:01 crc kubenswrapper[4722]: I0226 20:50:01.874204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerStarted","Data":"030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25"} Feb 26 20:50:02 crc kubenswrapper[4722]: I0226 20:50:02.883920 4722 generic.go:334] "Generic (PLEG): container finished" podID="28757563-9165-4c96-82ec-c961b940926a" containerID="1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b" exitCode=0 Feb 26 20:50:02 crc kubenswrapper[4722]: I0226 20:50:02.884019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerDied","Data":"1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b"} Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.364338 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.498216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"28757563-9165-4c96-82ec-c961b940926a\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.506753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6" (OuterVolumeSpecName: "kube-api-access-zw7v6") pod "28757563-9165-4c96-82ec-c961b940926a" (UID: "28757563-9165-4c96-82ec-c961b940926a"). InnerVolumeSpecName "kube-api-access-zw7v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.601106 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") on node \"crc\" DevicePath \"\"" Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerDied","Data":"030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25"} Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903607 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25" Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903610 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:05 crc kubenswrapper[4722]: I0226 20:50:05.472469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"] Feb 26 20:50:05 crc kubenswrapper[4722]: I0226 20:50:05.489314 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"] Feb 26 20:50:06 crc kubenswrapper[4722]: I0226 20:50:06.158529 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17ea072-9011-410f-ae84-267fefe73604" path="/var/lib/kubelet/pods/d17ea072-9011-410f-ae84-267fefe73604/volumes" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.250821 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:50:25 crc kubenswrapper[4722]: E0226 20:50:25.251856 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.251938 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.252147 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.253698 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.264125 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439448 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.440167 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.471235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.582119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:26 crc kubenswrapper[4722]: I0226 20:50:26.078994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:50:26 crc kubenswrapper[4722]: I0226 20:50:26.114221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"ae25261716be93769db5fe41e80931b0b713d9d65ba4d9f090d4373706731a67"} Feb 26 20:50:27 crc kubenswrapper[4722]: I0226 20:50:27.144788 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff" exitCode=0 Feb 26 20:50:27 crc kubenswrapper[4722]: I0226 20:50:27.145112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"} Feb 26 20:50:28 crc kubenswrapper[4722]: I0226 20:50:28.166626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"} Feb 26 20:50:28 crc kubenswrapper[4722]: I0226 20:50:28.898808 4722 scope.go:117] "RemoveContainer" containerID="aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475" Feb 26 20:50:33 crc kubenswrapper[4722]: I0226 20:50:33.205981 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be" exitCode=0 Feb 26 20:50:33 crc kubenswrapper[4722]: I0226 20:50:33.206090 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"} Feb 26 20:50:34 crc kubenswrapper[4722]: I0226 20:50:34.220852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"} Feb 26 20:50:34 crc kubenswrapper[4722]: I0226 20:50:34.247422 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nprwh" podStartSLOduration=2.793106266 podStartE2EDuration="9.247398937s" podCreationTimestamp="2026-02-26 20:50:25 +0000 UTC" firstStartedPulling="2026-02-26 20:50:27.151000117 +0000 UTC m=+3369.687968041" lastFinishedPulling="2026-02-26 20:50:33.605292788 +0000 UTC m=+3376.142260712" observedRunningTime="2026-02-26 20:50:34.235468014 +0000 UTC m=+3376.772435958" watchObservedRunningTime="2026-02-26 20:50:34.247398937 +0000 UTC m=+3376.784366871" Feb 26 20:50:35 crc kubenswrapper[4722]: I0226 20:50:35.582961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:35 crc kubenswrapper[4722]: I0226 20:50:35.584463 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:50:36 crc kubenswrapper[4722]: I0226 20:50:36.631462 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=< Feb 26 20:50:36 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:50:36 crc kubenswrapper[4722]: > Feb 26 20:50:46 crc kubenswrapper[4722]: 
I0226 20:50:46.643357 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=< Feb 26 20:50:46 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:50:46 crc kubenswrapper[4722]: > Feb 26 20:50:53 crc kubenswrapper[4722]: I0226 20:50:53.487919 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:50:53 crc kubenswrapper[4722]: I0226 20:50:53.488476 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:50:56 crc kubenswrapper[4722]: I0226 20:50:56.626824 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=< Feb 26 20:50:56 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:50:56 crc kubenswrapper[4722]: > Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.630727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.679527 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.869723 4722 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:51:07 crc kubenswrapper[4722]: I0226 20:51:07.536988 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" containerID="cri-o://5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" gracePeriod=2 Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.389755 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.492953 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.493130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.493271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.495275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities" (OuterVolumeSpecName: "utilities") pod 
"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.509770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz" (OuterVolumeSpecName: "kube-api-access-vqjzz") pod "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "kube-api-access-vqjzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549916 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" exitCode=0 Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549982 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549988 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"} Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.550117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"ae25261716be93769db5fe41e80931b0b713d9d65ba4d9f090d4373706731a67"} Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.550175 4722 scope.go:117] "RemoveContainer" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.580856 4722 scope.go:117] "RemoveContainer" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.595958 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") on node \"crc\" DevicePath \"\"" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.595992 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.604477 4722 scope.go:117] "RemoveContainer" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.643542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.656177 4722 scope.go:117] "RemoveContainer" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.658296 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": container with ID starting with 5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f not found: ID does not exist" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.658351 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"} err="failed to get container status \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": rpc error: code = NotFound desc = could not find container \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": container with ID starting with 5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f not found: ID does not exist" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.658386 4722 scope.go:117] "RemoveContainer" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be" Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.659210 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": container with ID 
starting with e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be not found: ID does not exist" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659254 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"} err="failed to get container status \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": rpc error: code = NotFound desc = could not find container \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": container with ID starting with e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be not found: ID does not exist" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659276 4722 scope.go:117] "RemoveContainer" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff" Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.659760 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": container with ID starting with 9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff not found: ID does not exist" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659796 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"} err="failed to get container status \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": rpc error: code = NotFound desc = could not find container \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": container with ID starting with 9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff not found: 
ID does not exist" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.698728 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.894399 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.931637 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"] Feb 26 20:51:10 crc kubenswrapper[4722]: I0226 20:51:10.158421 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" path="/var/lib/kubelet/pods/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328/volumes" Feb 26 20:51:23 crc kubenswrapper[4722]: I0226 20:51:23.486848 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:51:23 crc kubenswrapper[4722]: I0226 20:51:23.487458 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.487209 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:51:53 crc 
kubenswrapper[4722]: I0226 20:51:53.487698 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.487739 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.488470 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.488526 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" gracePeriod=600 Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.959841 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" exitCode=0 Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.959911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"} 
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.960278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"} Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.960303 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.166992 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168293 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-utilities" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-utilities" Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168339 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168347 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168364 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-content" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168374 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-content" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168637 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.169962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.172771 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.173007 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.173254 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.184396 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.323173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.425313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.444194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.497494 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:01 crc kubenswrapper[4722]: I0226 20:52:01.040931 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:52:01 crc kubenswrapper[4722]: I0226 20:52:01.065101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerStarted","Data":"7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86"} Feb 26 20:52:04 crc kubenswrapper[4722]: I0226 20:52:04.092347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerStarted","Data":"093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c"} Feb 26 20:52:04 crc kubenswrapper[4722]: I0226 20:52:04.110239 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" podStartSLOduration=1.48768707 podStartE2EDuration="4.11021865s" podCreationTimestamp="2026-02-26 20:52:00 +0000 UTC" firstStartedPulling="2026-02-26 20:52:01.045339138 +0000 UTC m=+3463.582307062" lastFinishedPulling="2026-02-26 20:52:03.667870718 +0000 UTC m=+3466.204838642" observedRunningTime="2026-02-26 20:52:04.10688267 +0000 UTC m=+3466.643850594" watchObservedRunningTime="2026-02-26 20:52:04.11021865 +0000 UTC m=+3466.647186574" Feb 26 20:52:05 crc kubenswrapper[4722]: I0226 20:52:05.103063 4722 
generic.go:334] "Generic (PLEG): container finished" podID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerID="093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c" exitCode=0 Feb 26 20:52:05 crc kubenswrapper[4722]: I0226 20:52:05.103213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerDied","Data":"093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c"} Feb 26 20:52:06 crc kubenswrapper[4722]: I0226 20:52:06.899448 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.078734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"5af60092-6c8e-4807-a060-3e9e7276ac0c\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.085274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5" (OuterVolumeSpecName: "kube-api-access-rlsd5") pod "5af60092-6c8e-4807-a060-3e9e7276ac0c" (UID: "5af60092-6c8e-4807-a060-3e9e7276ac0c"). InnerVolumeSpecName "kube-api-access-rlsd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerDied","Data":"7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86"} Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122610 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86" Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122620 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.185310 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.202131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.212592 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:52:08 crc kubenswrapper[4722]: I0226 20:52:08.159831 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" path="/var/lib/kubelet/pods/a7591020-38d2-4c4d-9c5f-958bc7a73ea8/volumes" Feb 26 20:52:29 crc kubenswrapper[4722]: I0226 20:52:29.017524 4722 scope.go:117] "RemoveContainer" containerID="498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.125876 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:32 crc kubenswrapper[4722]: E0226 20:52:32.126872 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.126885 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.127121 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.128937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.138740 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.190857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.191036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.191310 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdblb\" 
(UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.295304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.295555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.313626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.450305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:33 crc kubenswrapper[4722]: I0226 20:52:33.069048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:33 crc kubenswrapper[4722]: I0226 20:52:33.410756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"799e21c619bc258b0a5dcf5f3e643dec288ae9855dab71b36564153e22b2de0b"} Feb 26 20:52:34 crc kubenswrapper[4722]: I0226 20:52:34.421028 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" exitCode=0 Feb 26 20:52:34 crc kubenswrapper[4722]: I0226 20:52:34.421151 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6"} Feb 26 20:52:35 crc kubenswrapper[4722]: I0226 20:52:35.433687 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} Feb 26 20:52:37 crc kubenswrapper[4722]: I0226 20:52:37.452240 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" exitCode=0 Feb 26 20:52:37 crc kubenswrapper[4722]: I0226 20:52:37.452344 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} Feb 26 20:52:38 crc kubenswrapper[4722]: I0226 20:52:38.463752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} Feb 26 20:52:38 crc kubenswrapper[4722]: I0226 20:52:38.484650 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85cdr" podStartSLOduration=3.081155483 podStartE2EDuration="6.484632744s" podCreationTimestamp="2026-02-26 20:52:32 +0000 UTC" firstStartedPulling="2026-02-26 20:52:34.422753514 +0000 UTC m=+3496.959721438" lastFinishedPulling="2026-02-26 20:52:37.826230775 +0000 UTC m=+3500.363198699" observedRunningTime="2026-02-26 20:52:38.480895764 +0000 UTC m=+3501.017863708" watchObservedRunningTime="2026-02-26 20:52:38.484632744 +0000 UTC m=+3501.021600668" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 20:52:42.451289 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 
20:52:42.451830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 20:52:42.504705 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.502236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.564438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.585915 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85cdr" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" containerID="cri-o://8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" gracePeriod=2 Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.284346 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371098 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.372629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities" (OuterVolumeSpecName: "utilities") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.379354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb" (OuterVolumeSpecName: "kube-api-access-sdblb") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "kube-api-access-sdblb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.420711 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473719 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473758 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473768 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597368 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" exitCode=0 Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597447 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"799e21c619bc258b0a5dcf5f3e643dec288ae9855dab71b36564153e22b2de0b"} Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597465 4722 scope.go:117] "RemoveContainer" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597595 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.628030 4722 scope.go:117] "RemoveContainer" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.638163 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.650571 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.657529 4722 scope.go:117] "RemoveContainer" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.713631 4722 scope.go:117] "RemoveContainer" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: E0226 20:52:53.714117 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": container with ID starting with 8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151 not found: ID does not exist" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 
20:52:53.714184 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} err="failed to get container status \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": rpc error: code = NotFound desc = could not find container \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": container with ID starting with 8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151 not found: ID does not exist" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714211 4722 scope.go:117] "RemoveContainer" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: E0226 20:52:53.714574 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": container with ID starting with e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5 not found: ID does not exist" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714608 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} err="failed to get container status \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": rpc error: code = NotFound desc = could not find container \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": container with ID starting with e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5 not found: ID does not exist" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714623 4722 scope.go:117] "RemoveContainer" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc 
kubenswrapper[4722]: E0226 20:52:53.714823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": container with ID starting with 5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6 not found: ID does not exist" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714843 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6"} err="failed to get container status \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": rpc error: code = NotFound desc = could not find container \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": container with ID starting with 5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6 not found: ID does not exist" Feb 26 20:52:54 crc kubenswrapper[4722]: I0226 20:52:54.157970 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" path="/var/lib/kubelet/pods/02cdc56b-7f25-4913-af99-6dbc1449e5a6/volumes" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.293680 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294826 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-content" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294843 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-content" Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294867 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-utilities" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-utilities" Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294902 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294910 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.295233 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.297309 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.310449 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.480754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.480839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: 
\"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.481714 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583482 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.584062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: 
\"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.584233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.605534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.662479 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:49 crc kubenswrapper[4722]: W0226 20:53:49.169365 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dcd915_0f3c_40d6_bf29_a4c2aba237ab.slice/crio-aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf WatchSource:0}: Error finding container aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf: Status 404 returned error can't find the container with id aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf Feb 26 20:53:49 crc kubenswrapper[4722]: I0226 20:53:49.170022 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185389 4722 generic.go:334] "Generic (PLEG): container finished" podID="72dcd915-0f3c-40d6-bf29-a4c2aba237ab" containerID="f1f85f85e2afa57b83297619ee799deeda8a7723a85d756585cc88451699cc35" exitCode=0 Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerDied","Data":"f1f85f85e2afa57b83297619ee799deeda8a7723a85d756585cc88451699cc35"} Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf"} Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.187342 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:53:53 crc kubenswrapper[4722]: I0226 20:53:53.487725 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:53:53 crc kubenswrapper[4722]: I0226 20:53:53.488456 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:53:55 crc kubenswrapper[4722]: I0226 20:53:55.247085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e"} Feb 26 20:53:56 crc kubenswrapper[4722]: I0226 20:53:56.259122 4722 generic.go:334] "Generic (PLEG): container finished" podID="72dcd915-0f3c-40d6-bf29-a4c2aba237ab" containerID="8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e" exitCode=0 Feb 26 20:53:56 crc kubenswrapper[4722]: I0226 20:53:56.259172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerDied","Data":"8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e"} Feb 26 20:53:57 crc kubenswrapper[4722]: I0226 20:53:57.271805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"d2620cda545bcda024fd7c454cfce56bd5634f296b6ec955d48325e5b2f04ade"} Feb 26 20:53:57 crc kubenswrapper[4722]: I0226 20:53:57.295067 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-fhlw6" podStartSLOduration=2.7875811539999997 podStartE2EDuration="9.295047661s" podCreationTimestamp="2026-02-26 20:53:48 +0000 UTC" firstStartedPulling="2026-02-26 20:53:50.187150071 +0000 UTC m=+3572.724117995" lastFinishedPulling="2026-02-26 20:53:56.694616578 +0000 UTC m=+3579.231584502" observedRunningTime="2026-02-26 20:53:57.288910135 +0000 UTC m=+3579.825878059" watchObservedRunningTime="2026-02-26 20:53:57.295047661 +0000 UTC m=+3579.832015585" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.664920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.665339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.712242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.159218 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.160937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.162093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.162484 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.163878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.164019 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.242703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.344982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.368156 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " 
pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.497364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:01 crc kubenswrapper[4722]: I0226 20:54:01.119101 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:01 crc kubenswrapper[4722]: I0226 20:54:01.306307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerStarted","Data":"f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59"} Feb 26 20:54:03 crc kubenswrapper[4722]: I0226 20:54:03.363171 4722 generic.go:334] "Generic (PLEG): container finished" podID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerID="3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352" exitCode=0 Feb 26 20:54:03 crc kubenswrapper[4722]: I0226 20:54:03.363620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerDied","Data":"3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352"} Feb 26 20:54:04 crc kubenswrapper[4722]: I0226 20:54:04.941951 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.047578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"15a0c50e-716b-4b9a-9a95-955e01050f2b\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.055266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j" (OuterVolumeSpecName: "kube-api-access-tmc8j") pod "15a0c50e-716b-4b9a-9a95-955e01050f2b" (UID: "15a0c50e-716b-4b9a-9a95-955e01050f2b"). InnerVolumeSpecName "kube-api-access-tmc8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.150189 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerDied","Data":"f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59"} Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384320 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384477 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.024033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.033651 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.156414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" path="/var/lib/kubelet/pods/717dda76-7ae7-403a-92e5-5e268a396d1d/volumes" Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.712070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.783005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.826143 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.826410 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zkkbw" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" containerID="cri-o://fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" gracePeriod=2 Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.438047 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" exitCode=0 Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3"} Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df"} Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439303 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.442782 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542127 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: 
\"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.543319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities" (OuterVolumeSpecName: "utilities") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.562231 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg" (OuterVolumeSpecName: "kube-api-access-xkcqg") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "kube-api-access-xkcqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.607401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644225 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644264 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644273 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.446002 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.475907 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.490582 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:12 crc kubenswrapper[4722]: I0226 20:54:12.157489 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" path="/var/lib/kubelet/pods/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5/volumes" Feb 26 20:54:17 crc kubenswrapper[4722]: I0226 20:54:17.517724 4722 generic.go:334] "Generic (PLEG): container finished" podID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerID="bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23" exitCode=0 Feb 26 20:54:17 crc kubenswrapper[4722]: I0226 20:54:17.517818 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerDied","Data":"bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23"} Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.139459 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236129 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236276 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.237781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" 
(UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.237955 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data" (OuterVolumeSpecName: "config-data") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.242977 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.243377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln" (OuterVolumeSpecName: "kube-api-access-fwmln") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "kube-api-access-fwmln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.268029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.274368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.276362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.291701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338918 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338954 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338964 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338976 4722 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338985 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338996 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.339005 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.339015 
4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.394405 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.442858 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerDied","Data":"01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77"} Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540949 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540974 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.628885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.647791 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:21.999851 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-content" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-content" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000625 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000636 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000644 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000661 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 
20:54:22.000668 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000678 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-utilities" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000685 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-utilities" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000911 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000928 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000939 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.001667 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.004795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dvrdk" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.009361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.105514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.105583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.208638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.208800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.209382 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.239873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.250784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.332041 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.786290 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.487693 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.488039 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.583735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e14dbf76-7427-43f1-a3b5-e94661bab656","Type":"ContainerStarted","Data":"937412b5a65803bc5fc30927d577975659b18c11d2306839555505773c512c90"} Feb 26 20:54:24 crc kubenswrapper[4722]: I0226 20:54:24.593310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e14dbf76-7427-43f1-a3b5-e94661bab656","Type":"ContainerStarted","Data":"df698c948fada19dca3a906a6f224b0e5034ed7bbba8a209e08580371c587202"} Feb 26 20:54:24 crc kubenswrapper[4722]: I0226 20:54:24.614501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.60760123 podStartE2EDuration="3.614484496s" 
podCreationTimestamp="2026-02-26 20:54:21 +0000 UTC" firstStartedPulling="2026-02-26 20:54:22.791121938 +0000 UTC m=+3605.328089862" lastFinishedPulling="2026-02-26 20:54:23.798005204 +0000 UTC m=+3606.334973128" observedRunningTime="2026-02-26 20:54:24.6075912 +0000 UTC m=+3607.144559134" watchObservedRunningTime="2026-02-26 20:54:24.614484496 +0000 UTC m=+3607.151452420" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.149070 4722 scope.go:117] "RemoveContainer" containerID="1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.193790 4722 scope.go:117] "RemoveContainer" containerID="de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.249409 4722 scope.go:117] "RemoveContainer" containerID="75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.311161 4722 scope.go:117] "RemoveContainer" containerID="fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487360 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487887 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.488654 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.488706 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" gracePeriod=600 Feb 26 20:54:53 crc kubenswrapper[4722]: E0226 20:54:53.615766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.892957 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" exitCode=0 Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.893009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"} Feb 26 20:54:53 crc 
kubenswrapper[4722]: I0226 20:54:53.893331 4722 scope.go:117] "RemoveContainer" containerID="29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.894032 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:54:53 crc kubenswrapper[4722]: E0226 20:54:53.894374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.386730 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.388870 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.394454 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v9pkb"/"kube-root-ca.crt" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.394644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v9pkb"/"default-dockercfg-tmqgl" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.395159 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v9pkb"/"openshift-service-ca.crt" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.407965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.547119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.547298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.649658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " 
pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.649762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.650234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.681991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.715617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:56 crc kubenswrapper[4722]: I0226 20:54:56.276944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:56 crc kubenswrapper[4722]: I0226 20:54:56.980220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"56297e44749472fd3d3f378315609829c7aa0213b9edc872dec78f92930f813f"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.077974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.078528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.106040 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" podStartSLOduration=2.347172627 podStartE2EDuration="9.106024608s" podCreationTimestamp="2026-02-26 20:54:55 +0000 UTC" firstStartedPulling="2026-02-26 20:54:56.279944407 +0000 UTC m=+3638.816912331" lastFinishedPulling="2026-02-26 20:55:03.038796388 +0000 UTC m=+3645.575764312" observedRunningTime="2026-02-26 20:55:04.098917106 +0000 UTC m=+3646.635885030" watchObservedRunningTime="2026-02-26 20:55:04.106024608 +0000 UTC m=+3646.642992532" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.114737 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.117068 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.262190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.262521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flvkm\" (UniqueName: 
\"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.390912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.437802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:08 crc kubenswrapper[4722]: I0226 20:55:08.128690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerStarted","Data":"b4578c4970c903456e9179d2949c6dae962e604849a730357aa082554d6d7a42"} Feb 26 20:55:09 crc kubenswrapper[4722]: I0226 20:55:09.146402 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:09 crc kubenswrapper[4722]: E0226 20:55:09.147096 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:21 crc kubenswrapper[4722]: I0226 20:55:21.333923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" 
event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerStarted","Data":"b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c"} Feb 26 20:55:21 crc kubenswrapper[4722]: I0226 20:55:21.357719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" podStartSLOduration=0.79173963 podStartE2EDuration="14.357695878s" podCreationTimestamp="2026-02-26 20:55:07 +0000 UTC" firstStartedPulling="2026-02-26 20:55:07.488355617 +0000 UTC m=+3650.025323541" lastFinishedPulling="2026-02-26 20:55:21.054311865 +0000 UTC m=+3663.591279789" observedRunningTime="2026-02-26 20:55:21.353410532 +0000 UTC m=+3663.890378476" watchObservedRunningTime="2026-02-26 20:55:21.357695878 +0000 UTC m=+3663.894663822" Feb 26 20:55:23 crc kubenswrapper[4722]: I0226 20:55:23.146760 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:23 crc kubenswrapper[4722]: E0226 20:55:23.147570 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.712645 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.715291 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.745153 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972472 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972770 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.973405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.993405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:32 crc kubenswrapper[4722]: I0226 20:55:32.041921 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:32 crc kubenswrapper[4722]: I0226 20:55:32.692054 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.462629 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" exitCode=0 Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.463111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a"} Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.463177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"3046ab3ba85ac329bb4cacfe6d1dee44a612efa91d57730217f524760c68e635"} Feb 26 20:55:34 crc kubenswrapper[4722]: I0226 20:55:34.472228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} Feb 26 20:55:36 crc kubenswrapper[4722]: I0226 20:55:36.497999 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" exitCode=0 Feb 26 20:55:36 crc kubenswrapper[4722]: I0226 20:55:36.498378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" 
event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.146277 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:37 crc kubenswrapper[4722]: E0226 20:55:37.146876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.517102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.544189 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t99jk" podStartSLOduration=2.829465396 podStartE2EDuration="6.544170033s" podCreationTimestamp="2026-02-26 20:55:31 +0000 UTC" firstStartedPulling="2026-02-26 20:55:33.46607459 +0000 UTC m=+3676.003042514" lastFinishedPulling="2026-02-26 20:55:37.180779227 +0000 UTC m=+3679.717747151" observedRunningTime="2026-02-26 20:55:37.537532494 +0000 UTC m=+3680.074500418" watchObservedRunningTime="2026-02-26 20:55:37.544170033 +0000 UTC m=+3680.081137957" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.042461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc 
kubenswrapper[4722]: I0226 20:55:42.044236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.119662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.619904 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.675690 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:44 crc kubenswrapper[4722]: I0226 20:55:44.588629 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t99jk" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" containerID="cri-o://e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" gracePeriod=2 Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.287480 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.482840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.482978 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.483422 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.483861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities" (OuterVolumeSpecName: "utilities") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.484105 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.498350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt" (OuterVolumeSpecName: "kube-api-access-kvvmt") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "kube-api-access-kvvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.533894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.586610 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.586646 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601638 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" exitCode=0 Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"3046ab3ba85ac329bb4cacfe6d1dee44a612efa91d57730217f524760c68e635"} Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601746 4722 scope.go:117] "RemoveContainer" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601899 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.626423 4722 scope.go:117] "RemoveContainer" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.681216 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.685458 4722 scope.go:117] "RemoveContainer" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.702774 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.708595 4722 scope.go:117] "RemoveContainer" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.711573 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": container with ID starting with e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5 not found: ID does not exist" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.711680 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} err="failed to get container status \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": rpc error: code = NotFound desc = could not find container \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": container with ID starting with e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5 not found: 
ID does not exist" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.711739 4722 scope.go:117] "RemoveContainer" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.721061 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": container with ID starting with a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282 not found: ID does not exist" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.721119 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} err="failed to get container status \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": rpc error: code = NotFound desc = could not find container \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": container with ID starting with a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282 not found: ID does not exist" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.721198 4722 scope.go:117] "RemoveContainer" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.723094 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": container with ID starting with 97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a not found: ID does not exist" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.723149 4722 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a"} err="failed to get container status \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": rpc error: code = NotFound desc = could not find container \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": container with ID starting with 97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a not found: ID does not exist" Feb 26 20:55:46 crc kubenswrapper[4722]: I0226 20:55:46.158301 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" path="/var/lib/kubelet/pods/bb237572-01ab-46ff-b1ca-6ce751086707/volumes" Feb 26 20:55:52 crc kubenswrapper[4722]: I0226 20:55:52.146717 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:52 crc kubenswrapper[4722]: E0226 20:55:52.147751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.155957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.156996 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-utilities" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157011 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-utilities" Feb 26 
20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.157051 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-content" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157059 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-content" Feb 26 20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.157081 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157088 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157308 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.158028 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.159713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.160467 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.164022 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.169405 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.285219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.387516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.405830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " 
pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.497830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:01 crc kubenswrapper[4722]: I0226 20:56:01.032596 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:01 crc kubenswrapper[4722]: I0226 20:56:01.753392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerStarted","Data":"9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6"} Feb 26 20:56:02 crc kubenswrapper[4722]: I0226 20:56:02.763921 4722 generic.go:334] "Generic (PLEG): container finished" podID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerID="832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02" exitCode=0 Feb 26 20:56:02 crc kubenswrapper[4722]: I0226 20:56:02.764013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerDied","Data":"832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02"} Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.288764 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.477740 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"6d81e072-7a00-4b3c-b823-692d3817a4a6\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.484380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc" (OuterVolumeSpecName: "kube-api-access-ffmgc") pod "6d81e072-7a00-4b3c-b823-692d3817a4a6" (UID: "6d81e072-7a00-4b3c-b823-692d3817a4a6"). InnerVolumeSpecName "kube-api-access-ffmgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.580654 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerDied","Data":"9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6"} Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789537 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.146735 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:05 crc kubenswrapper[4722]: E0226 20:56:05.147271 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.368236 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.381376 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:56:06 crc kubenswrapper[4722]: I0226 20:56:06.161208 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28757563-9165-4c96-82ec-c961b940926a" path="/var/lib/kubelet/pods/28757563-9165-4c96-82ec-c961b940926a/volumes" Feb 26 20:56:15 crc kubenswrapper[4722]: I0226 20:56:15.888013 4722 generic.go:334] "Generic (PLEG): container finished" podID="978d6489-4c20-4492-91e8-528a0e0715ba" containerID="b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c" exitCode=0 Feb 26 20:56:15 crc kubenswrapper[4722]: I0226 20:56:15.888101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerDied","Data":"b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c"} Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.006702 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.044384 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.063696 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.133644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"978d6489-4c20-4492-91e8-528a0e0715ba\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.134302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"978d6489-4c20-4492-91e8-528a0e0715ba\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.134846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host" (OuterVolumeSpecName: "host") pod "978d6489-4c20-4492-91e8-528a0e0715ba" (UID: "978d6489-4c20-4492-91e8-528a0e0715ba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.141499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm" (OuterVolumeSpecName: "kube-api-access-flvkm") pod "978d6489-4c20-4492-91e8-528a0e0715ba" (UID: "978d6489-4c20-4492-91e8-528a0e0715ba"). InnerVolumeSpecName "kube-api-access-flvkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.236580 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.236625 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.906829 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4578c4970c903456e9179d2949c6dae962e604849a730357aa082554d6d7a42" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.906921 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.159388 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" path="/var/lib/kubelet/pods/978d6489-4c20-4492-91e8-528a0e0715ba/volumes" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.238689 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:18 crc kubenswrapper[4722]: E0226 20:56:18.239079 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239096 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: E0226 20:56:18.239120 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" 
Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239126 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239353 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239383 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.240058 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.354409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.354679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.456049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 
20:56:18.456528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.456660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.475274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.554911 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.931515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerStarted","Data":"bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f"} Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.931874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerStarted","Data":"90432cd69ca7aefd1d9225c5a0e584da55871d6624e3086db49d174ac2d3767a"} Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.948540 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/crc-debug-894h2" podStartSLOduration=0.948521993 podStartE2EDuration="948.521993ms" podCreationTimestamp="2026-02-26 20:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:56:18.946541851 +0000 UTC m=+3721.483509775" watchObservedRunningTime="2026-02-26 20:56:18.948521993 +0000 UTC m=+3721.485489917" Feb 26 20:56:19 crc kubenswrapper[4722]: I0226 20:56:19.145549 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:19 crc kubenswrapper[4722]: E0226 20:56:19.145884 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:19 crc 
kubenswrapper[4722]: I0226 20:56:19.945845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerDied","Data":"bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f"} Feb 26 20:56:19 crc kubenswrapper[4722]: I0226 20:56:19.945593 4722 generic.go:334] "Generic (PLEG): container finished" podID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerID="bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f" exitCode=0 Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.061557 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.095438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.104093 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.234854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.234989 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host" (OuterVolumeSpecName: "host") pod "a97a6b0d-37d8-49ec-b882-4ff2d36cb701" (UID: "a97a6b0d-37d8-49ec-b882-4ff2d36cb701"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.235110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.235653 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.246411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw" (OuterVolumeSpecName: "kube-api-access-bh6rw") pod "a97a6b0d-37d8-49ec-b882-4ff2d36cb701" (UID: "a97a6b0d-37d8-49ec-b882-4ff2d36cb701"). InnerVolumeSpecName "kube-api-access-bh6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.338357 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.963804 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90432cd69ca7aefd1d9225c5a0e584da55871d6624e3086db49d174ac2d3767a" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.963858 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.158383 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" path="/var/lib/kubelet/pods/a97a6b0d-37d8-49ec-b882-4ff2d36cb701/volumes" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430259 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:22 crc kubenswrapper[4722]: E0226 20:56:22.430705 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430727 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430932 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.431651 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.563470 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.563936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc 
kubenswrapper[4722]: I0226 20:56:22.687969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.751557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: W0226 20:56:22.787161 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f0a8c07_aefe_49d6_a4bc_3eab73cad424.slice/crio-f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9 WatchSource:0}: Error finding container f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9: Status 404 returned error can't find the container with id f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9 Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.974523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" event={"ID":"1f0a8c07-aefe-49d6-a4bc-3eab73cad424","Type":"ContainerStarted","Data":"f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9"} Feb 26 20:56:23 crc kubenswrapper[4722]: I0226 20:56:23.992318 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerID="f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946" exitCode=0 Feb 26 20:56:23 crc kubenswrapper[4722]: I0226 20:56:23.992394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" event={"ID":"1f0a8c07-aefe-49d6-a4bc-3eab73cad424","Type":"ContainerDied","Data":"f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946"} Feb 26 
20:56:24 crc kubenswrapper[4722]: I0226 20:56:24.036320 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:24 crc kubenswrapper[4722]: I0226 20:56:24.046958 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.108360 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host" (OuterVolumeSpecName: "host") pod "1f0a8c07-aefe-49d6-a4bc-3eab73cad424" (UID: "1f0a8c07-aefe-49d6-a4bc-3eab73cad424"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.219044 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.226349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj" (OuterVolumeSpecName: "kube-api-access-5vxkj") pod "1f0a8c07-aefe-49d6-a4bc-3eab73cad424" (UID: "1f0a8c07-aefe-49d6-a4bc-3eab73cad424"). InnerVolumeSpecName "kube-api-access-5vxkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.320928 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.011245 4722 scope.go:117] "RemoveContainer" containerID="f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.011281 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.158513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" path="/var/lib/kubelet/pods/1f0a8c07-aefe-49d6-a4bc-3eab73cad424/volumes" Feb 26 20:56:29 crc kubenswrapper[4722]: I0226 20:56:29.415638 4722 scope.go:117] "RemoveContainer" containerID="1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b" Feb 26 20:56:32 crc kubenswrapper[4722]: I0226 20:56:32.146174 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:32 crc kubenswrapper[4722]: E0226 20:56:32.146764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:45 crc kubenswrapper[4722]: I0226 20:56:45.147485 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:45 crc kubenswrapper[4722]: E0226 20:56:45.148304 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.605382 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/init-config-reloader/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.849936 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/init-config-reloader/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.871894 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/alertmanager/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.904372 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/config-reloader/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.046516 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695d67b888-54s74_eb79c8d8-0608-427d-9757-0186e5ebc504/barbican-api/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.065763 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695d67b888-54s74_eb79c8d8-0608-427d-9757-0186e5ebc504/barbican-api-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.164605 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78948b6746-t9s8h_c01eeff5-0acc-4fd4-9097-9b3e8a888ccd/barbican-keystone-listener/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.334632 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78948b6746-t9s8h_c01eeff5-0acc-4fd4-9097-9b3e8a888ccd/barbican-keystone-listener-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.408640 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-7c8844bc6c-vsnhr_eba88113-0067-4ac3-873a-36e97ce5ef3b/barbican-worker-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.436033 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8844bc6c-vsnhr_eba88113-0067-4ac3-873a-36e97ce5ef3b/barbican-worker/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.584321 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx_7aea65fe-4b22-44f8-b756-2ee54c916c8a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.696583 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/ceilometer-central-agent/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.771419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/ceilometer-notification-agent/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.795359 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/proxy-httpd/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.896063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/sg-core/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.013160 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2805299d-4ab4-420c-aa59-bc54594053d5/cinder-api-log/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.065962 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2805299d-4ab4-420c-aa59-bc54594053d5/cinder-api/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.207639 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_116b7592-ce3d-44ff-94d9-2a16103f4058/cinder-scheduler/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.266585 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_116b7592-ce3d-44ff-94d9-2a16103f4058/probe/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.436055 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8fb8d392-1263-4049-bb26-f832cc4526e1/cloudkitty-api-log/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.507210 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8fb8d392-1263-4049-bb26-f832cc4526e1/cloudkitty-api/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.621627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_a66cb8be-67f7-46f6-90c1-914129608068/loki-compactor/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.747163 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-w5dgv_b1e5ce93-d4cd-4ef0-a71b-f63165e558cb/loki-distributor/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.818925 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-4txnm_43abd91c-064b-4440-9bb9-8f9768720659/gateway/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.966853 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-rmttg_23fc144a-bb55-464d-8f21-94038bf68ecd/gateway/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.313649 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12/loki-index-gateway/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 
20:56:54.323754 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_082c8f6a-a03f-4567-891c-56b6aa6f26d3/loki-ingester/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.530199 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9_734bb9a8-948b-4d5a-bdb1-df37ad791e6b/loki-query-frontend/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.948963 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sz98m_19a53cda-4020-471d-a7f3-6e410ae94b65/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.005609 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-n4b6c_1e16be72-77f7-43fb-a6bf-04088d7c6c0b/loki-querier/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.190781 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v_4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.397961 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/init/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.664904 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/init/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.838249 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/dnsmasq-dns/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.981286 4722 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx_8d72a53a-52c1-427e-a1be-81a00129c7bd/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.037657 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a45004da-d9b9-4962-a4d3-2a1175e78747/glance-log/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.110715 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a45004da-d9b9-4962-a4d3-2a1175e78747/glance-httpd/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.415258 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7a665ecb-6cf5-402f-aee1-26ebfcd9583c/glance-httpd/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.475436 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7a665ecb-6cf5-402f-aee1-26ebfcd9583c/glance-log/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.590705 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx_77f3d316-1f72-4a5a-b730-7f8dab299ca8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.691332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ps5f2_ae283069-3ec3-4960-b66a-b830709cb1ee/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.021774 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6e07189c-f69a-4914-8fe7-efbdcf3c5882/kube-state-metrics/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.158493 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq_32f8d32f-af41-44a8-a252-50bdabeeab06/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.236926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db9cf967f-jqqzk_783243ef-530a-418a-98b7-9f781077e95a/keystone-api/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.612615 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b6f7bc47c-7t9k4_d3b8803c-74dc-4932-9bdc-d45ca70103c4/neutron-httpd/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.676025 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b6f7bc47c-7t9k4_d3b8803c-74dc-4932-9bdc-d45ca70103c4/neutron-api/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.816681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2_a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.145588 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:58 crc kubenswrapper[4722]: E0226 20:56:58.145899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.441121 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9ddeffe-fdc8-4671-9197-da3818ccdfb1/nova-api-log/0.log" Feb 
26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.564305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_94a25c7f-6346-4ce4-ba05-130047eee9b5/nova-cell0-conductor-conductor/0.log" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.673579 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9ddeffe-fdc8-4671-9197-da3818ccdfb1/nova-api-api/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.147433 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_24c715e3-32ab-4d06-b3d3-4ce8281bb54b/nova-cell1-conductor-conductor/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.222100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ea10c214-f090-4ada-b1dd-ec1e9a153fb1/nova-cell1-novncproxy-novncproxy/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.483914 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gfgw9_6d48f7c6-d170-4dea-9214-5324870b8311/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.747133 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b5d6d9cc-9697-46cc-ab38-7879ef449ab3/nova-metadata-log/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.162407 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_37cb4b4d-ebfb-4070-b002-a20ec25dce18/nova-scheduler-scheduler/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.226057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.519672 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.555307 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/galera/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.783405 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.946445 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/mysql-bootstrap/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:00.999965 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/galera/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.255770 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d/openstackclient/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.368112 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b5d6d9cc-9697-46cc-ab38-7879ef449ab3/nova-metadata-metadata/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.483496 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nfkn8_721ad050-b6a8-432b-89b0-226c0efa6222/openstack-network-exporter/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.691514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server-init/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.850876 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server-init/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.876939 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovs-vswitchd/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.922669 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.104741 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rsgbx_5c9c23c8-6fed-49f5-abe1-d44b885952ec/ovn-controller/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.412770 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-28p8r_a0266eb0-8a26-4701-9014-93e0f03724ab/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.427758 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c64118dc-ed6e-478a-9c59-d7e24212daba/openstack-network-exporter/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.613006 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c64118dc-ed6e-478a-9c59-d7e24212daba/ovn-northd/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.661421 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4601fbad-d1bf-4205-86c5-a392e381300e/openstack-network-exporter/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.863303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4601fbad-d1bf-4205-86c5-a392e381300e/ovsdbserver-nb/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.897841 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe/openstack-network-exporter/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.120750 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe/ovsdbserver-sb/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.251835 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-866c89845b-gpgsw_fee2bbcc-fdd9-440d-8f6f-66206142c2f8/placement-api/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.436084 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-866c89845b-gpgsw_fee2bbcc-fdd9-440d-8f6f-66206142c2f8/placement-log/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.494156 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/init-config-reloader/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.739744 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/init-config-reloader/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.774356 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/prometheus/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.824851 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/config-reloader/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.021724 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/thanos-sidecar/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.049905 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.277638 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.321235 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/rabbitmq/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.500716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.735619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.775583 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/rabbitmq/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.000241 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f_4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.060105 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-65knh_5a0a077a-aebd-490b-b110-bc7927910d4a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.228786 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq_a1a3db58-368f-4ea3-a807-ddd7c58435f5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.445157 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6lnh4_1f7a8d95-7d72-427d-8bd1-f0ec3e512458/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.601962 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r8rtz_5b58da6a-b54c-41f9-a1fc-49021ec39a2c/ssh-known-hosts-edpm-deployment/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.873104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b495fbf79-442st_d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17/proxy-server/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.909922 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b495fbf79-442st_d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17/proxy-httpd/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.055850 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vfmbj_be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21/swift-ring-rebalance/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.209099 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-auditor/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.307305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-reaper/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.430375 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-server/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.472594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-replicator/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.565471 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-auditor/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.656303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-replicator/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.694983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-server/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.862863 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-updater/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.964280 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-expirer/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.978511 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-auditor/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.310038 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-replicator/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.341022 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-proc-0_73cc9447-4501-43ec-9f4a-2e406341ee16/cloudkitty-proc/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.391114 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-server/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.425348 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-updater/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.510743 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/swift-recon-cron/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.605290 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/rsync/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.799158 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4htcq_da1f8648-e221-4b8e-8691-5e88fc460998/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.890776 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_48c7de81-f528-48d3-bb95-99a9cf36f43f/tempest-tests-tempest-tests-runner/0.log" Feb 26 20:57:08 crc kubenswrapper[4722]: I0226 20:57:08.034924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e14dbf76-7427-43f1-a3b5-e94661bab656/test-operator-logs-container/0.log" Feb 26 20:57:08 crc kubenswrapper[4722]: I0226 20:57:08.170081 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-td2t2_37b9e07c-5396-48b5-a8cb-6eab31621fc8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:12 crc kubenswrapper[4722]: I0226 20:57:12.146229 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:12 crc kubenswrapper[4722]: E0226 20:57:12.147036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:13 crc kubenswrapper[4722]: I0226 20:57:13.860448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0a4edaeb-4029-4586-ab06-d09489d2e944/memcached/0.log" Feb 26 20:57:25 crc kubenswrapper[4722]: I0226 20:57:25.146748 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:25 crc kubenswrapper[4722]: E0226 20:57:25.147448 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.428627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 
20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.639363 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.663752 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.680487 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.834296 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.850689 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.875288 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/extract/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.294263 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ngk6x_f6b9ed59-4089-4a80-bdae-368d169363f2/manager/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.636048 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-nrssm_a2804dbe-f9c5-4aca-b3f5-6392d2bc20db/manager/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.816905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-hw5f9_604550ce-766e-48bb-a0a7-d14b7708a44e/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.043624 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-dzzdm_109ec0d2-04bf-4476-b14c-51249361da38/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.568063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-56c7w_a21a637b-e5c6-47ab-a41e-9622452be17e/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.667760 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jmhxt_c59c3e1b-9d18-45eb-a409-bd2176527063/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.718589 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-dhv4g_2a379e8a-c5df-465e-8b23-6b9ee6c874f9/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.078788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-v5zlv_e42d4e0f-1071-4cb4-b9ff-90d02236a1a2/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.101939 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mxqjv_b96ea9ca-8ca1-41aa-af25-a184c79bf18f/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.145626 4722 scope.go:117] "RemoveContainer" 
containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:39 crc kubenswrapper[4722]: E0226 20:57:39.145943 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.378107 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6sm8h_873eb62b-74db-41cc-8249-3578cf2f59b4/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.556931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-rlcpj_371eef1d-3e55-48bb-8b14-f2c36fbc5689/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.834443 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-qjxzz_6bc05a1e-4ace-47bc-af66-42c44dc19b80/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.931831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-tm8j8_5d4b2367-21d7-4be2-a83b-1932bd988df5/manager/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.354637 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc_7e0beaae-8f5c-4504-9d2a-1b32980e4f37/manager/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.929092 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-f7qpg_73eb4662-b5c2-4bad-a2ee-6bfbe704e239/registry-server/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.937448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bd4858f4d-4spcc_47a13091-6ef0-488e-98aa-beb72bc48ce6/operator/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.221833 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-c5544_710dce51-9c0f-4b66-9f5e-39cfe744f275/manager/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.419732 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-mrjvd_2efbc411-9d10-4261-952f-5b97cbdc9e48/manager/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.535404 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hhp7x_50694186-e31c-499d-ba48-e5818eeceee5/operator/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.838103 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-pwtl7_4b98eee6-c514-4ca3-8544-a6978b6ed230/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.078207 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-lrk22_532a7206-b336-4471-b9ad-c009c9395015/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.304611 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-vqjv6_c3c3e040-3df2-4b02-9d09-a76bcc90b882/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.670948 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85bcd67d77-fkpjs_2bcd6197-b9a9-4330-a25f-aab80685aa27/manager/0.log" Feb 26 20:57:43 crc kubenswrapper[4722]: I0226 20:57:43.063918 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-58b9cb6558-sph4f_c7d97484-b285-458e-94f4-3bd8700a25d7/manager/0.log" Feb 26 20:57:45 crc kubenswrapper[4722]: I0226 20:57:45.077980 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-gh42q_71fdb02f-7fa5-4151-bec9-7e7d3ac072dd/manager/0.log" Feb 26 20:57:53 crc kubenswrapper[4722]: I0226 20:57:53.145984 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:53 crc kubenswrapper[4722]: E0226 20:57:53.146793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.139857 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:00 crc kubenswrapper[4722]: E0226 20:58:00.143152 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.143187 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.143425 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.144548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150585 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.158121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.242067 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.344324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.368205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.469461 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.928540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:01 crc kubenswrapper[4722]: I0226 20:58:01.911794 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerStarted","Data":"76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257"} Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.318433 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dfrb6_15c05814-e318-455c-83f7-40698b29a44d/control-plane-machine-set-operator/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.491302 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bzbtt_8bd819da-de96-4dc4-a893-2ae7b1be33b2/kube-rbac-proxy/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.541436 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bzbtt_8bd819da-de96-4dc4-a893-2ae7b1be33b2/machine-api-operator/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.922200 4722 generic.go:334] "Generic (PLEG): container finished" podID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerID="32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8" 
exitCode=0 Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.922282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerDied","Data":"32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8"} Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.481998 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.533354 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.542205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2" (OuterVolumeSpecName: "kube-api-access-jdbg2") pod "d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" (UID: "d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923"). InnerVolumeSpecName "kube-api-access-jdbg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.635447 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") on node \"crc\" DevicePath \"\"" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerDied","Data":"76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257"} Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943944 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:05 crc kubenswrapper[4722]: I0226 20:58:05.549379 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:58:05 crc kubenswrapper[4722]: I0226 20:58:05.558849 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:58:06 crc kubenswrapper[4722]: I0226 20:58:06.165089 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" path="/var/lib/kubelet/pods/5af60092-6c8e-4807-a060-3e9e7276ac0c/volumes" Feb 26 20:58:08 crc kubenswrapper[4722]: I0226 20:58:08.153245 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:08 crc kubenswrapper[4722]: E0226 20:58:08.153890 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.745242 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-9d76n_d66ba312-de97-438e-a172-5bcd2b6ef4db/cert-manager-controller/0.log" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.926931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-frp6h_c966e2d5-2260-4d2f-ab59-4658284e872d/cert-manager-cainjector/0.log" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.973074 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-45hpn_4b627d55-dcd7-42c6-948f-a50f17bc7688/cert-manager-webhook/0.log" Feb 26 20:58:22 crc kubenswrapper[4722]: I0226 20:58:22.146529 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:22 crc kubenswrapper[4722]: E0226 20:58:22.147366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:29 crc kubenswrapper[4722]: I0226 20:58:29.555910 4722 scope.go:117] "RemoveContainer" containerID="093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c" Feb 26 20:58:33 crc kubenswrapper[4722]: I0226 
20:58:33.145415 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:33 crc kubenswrapper[4722]: E0226 20:58:33.146243 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.640070 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m7dz9_fae3dc9f-133c-42a5-82ef-23750fb2ffec/nmstate-handler/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.662977 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-6gtm5_29b96d96-cf6b-46a4-89c5-4a9e1b2669c7/nmstate-console-plugin/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.823862 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2rfd_65a85ed5-3f32-48e8-95b3-4576eb4ae0ea/kube-rbac-proxy/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.899833 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2rfd_65a85ed5-3f32-48e8-95b3-4576eb4ae0ea/nmstate-metrics/0.log" Feb 26 20:58:35 crc kubenswrapper[4722]: I0226 20:58:35.050366 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-lpl8c_a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb/nmstate-operator/0.log" Feb 26 20:58:35 crc kubenswrapper[4722]: I0226 20:58:35.197369 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-fqbwr_92200730-c944-47cc-bed8-8f8f7ac84819/nmstate-webhook/0.log" Feb 26 20:58:47 crc kubenswrapper[4722]: I0226 20:58:47.146339 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:47 crc kubenswrapper[4722]: E0226 20:58:47.147081 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:49 crc kubenswrapper[4722]: I0226 20:58:49.104114 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/kube-rbac-proxy/0.log" Feb 26 20:58:49 crc kubenswrapper[4722]: I0226 20:58:49.169771 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/manager/0.log" Feb 26 20:59:02 crc kubenswrapper[4722]: I0226 20:59:02.146735 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:02 crc kubenswrapper[4722]: E0226 20:59:02.147609 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 
26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.210483 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rgq4_edddb923-4396-43c9-880a-ed3ac0215808/prometheus-operator/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.335862 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_dde658b6-956e-4b8c-86b6-e707bfcc0dbf/prometheus-operator-admission-webhook/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.412306 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_419eee0b-c988-42e3-af4f-cef110425bb3/prometheus-operator-admission-webhook/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.527312 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bmtvj_b61de85a-5167-4af3-b14b-993cb20559fa/operator/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.611795 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tf59s_c5420a13-8c3b-45fa-9c99-a796202b11d9/perses-operator/0.log" Feb 26 20:59:15 crc kubenswrapper[4722]: I0226 20:59:15.146915 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:15 crc kubenswrapper[4722]: E0226 20:59:15.147899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:59:19 crc 
kubenswrapper[4722]: I0226 20:59:19.031707 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-gpj96_80c4aae3-6c63-43f6-8dcb-46e953562c67/kube-rbac-proxy/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.222923 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-gpj96_80c4aae3-6c63-43f6-8dcb-46e953562c67/controller/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.467832 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.684649 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.699619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.722282 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.723804 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.901014 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.907419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.907922 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.959549 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.191681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.192786 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.231533 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.251359 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/controller/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.422057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/frr-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.442440 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/kube-rbac-proxy-frr/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.476832 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/kube-rbac-proxy/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.801872 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/reloader/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.866259 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-s8rl7_0ee913a7-6a3f-46e5-99f8-d405722ef55e/frr-k8s-webhook-server/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.055803 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-ccc6bdbb5-xpd7z_52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d/manager/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.232878 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65586c54c8-bwxhb_0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b/webhook-server/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.318482 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9jh2_de675145-f60b-4c0c-b5c9-ef0b33e10c29/kube-rbac-proxy/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.879730 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9jh2_de675145-f60b-4c0c-b5c9-ef0b33e10c29/speaker/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.950898 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/frr/0.log" Feb 26 20:59:30 crc kubenswrapper[4722]: I0226 20:59:30.149242 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:30 crc kubenswrapper[4722]: E0226 20:59:30.150108 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.070275 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.556889 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.597424 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.648287 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.760607 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.831575 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.869359 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/extract/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.361191 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.605466 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.614613 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.632312 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.855844 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/extract/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.879827 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.903218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 
26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.039106 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.223352 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.242335 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.259217 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.449309 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.449875 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/extract/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.495479 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.101997 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.302660 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.339290 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.341205 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.508154 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.537376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.696539 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/registry-server/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.067919 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.237485 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.239024 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.243725 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.447957 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.518895 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.622196 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.918230 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.918574 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 21:59:41.000066 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.099924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/registry-server/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.193468 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.239321 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.284263 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/extract/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.361163 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n4nc7_35655c90-2927-4858-a067-3e520498cd26/marketplace-operator/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.451629 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.567660 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.585247 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.615912 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.798250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.804709 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.834801 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.962471 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/registry-server/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.092510 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.145583 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"
Feb 26 20:59:42 crc kubenswrapper[4722]: E0226 20:59:42.145825 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.153648 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.158633 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.376840 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.400622 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.768691 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/registry-server/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.243404 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rgq4_edddb923-4396-43c9-880a-ed3ac0215808/prometheus-operator/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.257926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_dde658b6-956e-4b8c-86b6-e707bfcc0dbf/prometheus-operator-admission-webhook/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.334622 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_419eee0b-c988-42e3-af4f-cef110425bb3/prometheus-operator-admission-webhook/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.486218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bmtvj_b61de85a-5167-4af3-b14b-993cb20559fa/operator/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.513677 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tf59s_c5420a13-8c3b-45fa-9c99-a796202b11d9/perses-operator/0.log"
Feb 26 20:59:56 crc kubenswrapper[4722]: I0226 20:59:56.167397 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"
Feb 26 20:59:57 crc kubenswrapper[4722]: I0226 20:59:57.056578 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"}
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.218265 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:00 crc kubenswrapper[4722]: E0226 21:00:00.219692 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.219708 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.220063 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.221001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.221577 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222581 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222665 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.224934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226442 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.225297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.326630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.430840 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.444133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.447959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.455850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.566888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.580263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.065661 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.087411 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.096195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerStarted","Data":"a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"}
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.316819 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:01 crc kubenswrapper[4722]: W0226 21:00:01.317004 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8c5676_a056_46a3_8b7d_d74804688463.slice/crio-df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19 WatchSource:0}: Error finding container df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19: Status 404 returned error can't find the container with id df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.107966 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f8c5676-a056-46a3-8b7d-d74804688463" containerID="e9b669ecf9a6dd7eb33eca423be013c2860c09df4984d89731a36e9e263f3fa5" exitCode=0
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.108071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerDied","Data":"e9b669ecf9a6dd7eb33eca423be013c2860c09df4984d89731a36e9e263f3fa5"}
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.108496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerStarted","Data":"df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"}
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.688715 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.796908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume" (OuterVolumeSpecName: "config-volume") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.802446 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.822276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns" (OuterVolumeSpecName: "kube-api-access-xl8ns") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "kube-api-access-xl8ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.897921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.898263 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.898273 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerDied","Data":"df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"}
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129168 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129224 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.781580 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"]
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.791715 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"]
Feb 26 21:00:05 crc kubenswrapper[4722]: I0226 21:00:05.139170 4722 generic.go:334] "Generic (PLEG): container finished" podID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerID="86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c" exitCode=0
Feb 26 21:00:05 crc kubenswrapper[4722]: I0226 21:00:05.139234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerDied","Data":"86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c"}
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.168113 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" path="/var/lib/kubelet/pods/0c234a00-8cb1-4bfb-906d-05e2d12f8222/volumes"
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.762206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.863699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") "
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.872233 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c" (OuterVolumeSpecName: "kube-api-access-4vg2c") pod "d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" (UID: "d5e74dca-b32b-4f09-8ceb-f66d79bac2f2"). InnerVolumeSpecName "kube-api-access-4vg2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.966691 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.158975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerDied","Data":"a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"}
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.159013 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.159021 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.833498 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"]
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.849226 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"]
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.158905 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" path="/var/lib/kubelet/pods/15a0c50e-716b-4b9a-9a95-955e01050f2b/volumes"
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.793376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/kube-rbac-proxy/0.log"
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.842365 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/manager/0.log"
Feb 26 21:00:29 crc kubenswrapper[4722]: I0226 21:00:29.643228 4722 scope.go:117] "RemoveContainer" containerID="7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422"
Feb 26 21:00:29 crc kubenswrapper[4722]: I0226 21:00:29.667434 4722 scope.go:117] "RemoveContainer" containerID="3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.984950 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:43 crc kubenswrapper[4722]: E0226 21:00:43.987623 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.987870 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: E0226 21:00:43.988006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988132 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988814 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.991371 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.004317 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.009592 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.009858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.010046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.113543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.114040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.132702 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.331450 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.895502 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523333 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" exitCode=0
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523646 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710"}
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"fd8e2e2ec27c80220fb17daf5b9497a036dae7d867af42a530686c1c17e3fe11"}
Feb 26 21:00:47 crc kubenswrapper[4722]: I0226 21:00:47.543584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"}
Feb 26 21:00:51 crc kubenswrapper[4722]: I0226 21:00:51.578854 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" exitCode=0
Feb 26 21:00:51 crc kubenswrapper[4722]: I0226 21:00:51.578962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"}
Feb 26 21:00:52 crc kubenswrapper[4722]: I0226 21:00:52.603447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"}
Feb 26 21:00:52 crc kubenswrapper[4722]: I0226 21:00:52.630152 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nc4nh" podStartSLOduration=3.057048099 podStartE2EDuration="9.630115111s" podCreationTimestamp="2026-02-26 21:00:43 +0000 UTC" firstStartedPulling="2026-02-26 21:00:45.525673744 +0000 UTC m=+3988.062641668" lastFinishedPulling="2026-02-26 21:00:52.098740766 +0000 UTC m=+3994.635708680" observedRunningTime="2026-02-26 21:00:52.628379994 +0000 UTC m=+3995.165347918" watchObservedRunningTime="2026-02-26 21:00:52.630115111 +0000 UTC m=+3995.167083045"
Feb 26 21:00:54 crc kubenswrapper[4722]: I0226 21:00:54.331513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:54 crc kubenswrapper[4722]: I0226 21:00:54.332396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:55 crc kubenswrapper[4722]: I0226 21:00:55.408522 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nc4nh" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" probeResult="failure" output=<
Feb 26 21:00:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 21:00:55 crc kubenswrapper[4722]: >
Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.161504 4722 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.163203 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.172570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " 
pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.338565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 
21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.339214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.342487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.351126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.494460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.001453 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.688455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerStarted","Data":"c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc"} Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.690407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerStarted","Data":"7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144"} Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.724853 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535661-vcrfc" podStartSLOduration=1.724830493 podStartE2EDuration="1.724830493s" podCreationTimestamp="2026-02-26 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 21:01:01.713195578 +0000 UTC m=+4004.250163522" watchObservedRunningTime="2026-02-26 21:01:01.724830493 +0000 UTC m=+4004.261798407" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.379581 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.451442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.613109 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:04 crc 
kubenswrapper[4722]: I0226 21:01:04.716007 4722 generic.go:334] "Generic (PLEG): container finished" podID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerID="c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc" exitCode=0 Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.716083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerDied","Data":"c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc"} Feb 26 21:01:05 crc kubenswrapper[4722]: I0226 21:01:05.724390 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc4nh" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" containerID="cri-o://6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" gracePeriod=2 Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.308372 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.424756 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.460799 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.460981 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.461785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.461811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.472348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn" (OuterVolumeSpecName: "kube-api-access-l5zgn") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "kube-api-access-l5zgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.477311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.547986 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566180 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " 
Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566732 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566742 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566750 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.568405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities" (OuterVolumeSpecName: "utilities") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.571394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz" (OuterVolumeSpecName: "kube-api-access-7pcpz") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "kube-api-access-7pcpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.574237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data" (OuterVolumeSpecName: "config-data") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.670684 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.670927 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.671000 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.715706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerDied","Data":"7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736273 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.746970 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" exitCode=0 Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"fd8e2e2ec27c80220fb17daf5b9497a036dae7d867af42a530686c1c17e3fe11"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747075 4722 scope.go:117] "RemoveContainer" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747233 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.773574 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.800279 4722 scope.go:117] "RemoveContainer" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.834098 4722 scope.go:117] "RemoveContainer" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.838066 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.850294 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.911291 4722 scope.go:117] "RemoveContainer" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.914579 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": container with ID starting with 6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf not found: ID does not exist" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914619 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"} err="failed to get container status 
\"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": rpc error: code = NotFound desc = could not find container \"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": container with ID starting with 6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf not found: ID does not exist" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914646 4722 scope.go:117] "RemoveContainer" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.914935 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": container with ID starting with 75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5 not found: ID does not exist" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914956 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"} err="failed to get container status \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": rpc error: code = NotFound desc = could not find container \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": container with ID starting with 75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5 not found: ID does not exist" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914970 4722 scope.go:117] "RemoveContainer" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.915587 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": container with ID starting with 12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710 not found: ID does not exist" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.915618 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710"} err="failed to get container status \"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": rpc error: code = NotFound desc = could not find container \"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": container with ID starting with 12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710 not found: ID does not exist" Feb 26 21:01:08 crc kubenswrapper[4722]: I0226 21:01:08.158847 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" path="/var/lib/kubelet/pods/eb4728ba-33a7-4284-9504-99dc3457b511/volumes" Feb 26 21:01:29 crc kubenswrapper[4722]: I0226 21:01:29.801568 4722 scope.go:117] "RemoveContainer" containerID="b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.165280 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166216 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-utilities" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166229 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-utilities" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166251 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166259 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166268 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-content" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166276 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-content" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166294 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166299 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166503 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166512 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.167230 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170433 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170659 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.199747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.238526 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.340186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.373243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " 
pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.493412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.961048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:02:01 crc kubenswrapper[4722]: I0226 21:02:01.281866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerStarted","Data":"61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604"} Feb 26 21:02:03 crc kubenswrapper[4722]: I0226 21:02:03.343665 4722 generic.go:334] "Generic (PLEG): container finished" podID="85353cc8-0b88-4e2a-8442-6599665e4037" containerID="6e7c09807a4f94fb7ad87f8ac0745d3c1704d5f90a58e16791661fcc845d89b6" exitCode=0 Feb 26 21:02:03 crc kubenswrapper[4722]: I0226 21:02:03.343727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerDied","Data":"6e7c09807a4f94fb7ad87f8ac0745d3c1704d5f90a58e16791661fcc845d89b6"} Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.013807 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.152124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"85353cc8-0b88-4e2a-8442-6599665e4037\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.159529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph" (OuterVolumeSpecName: "kube-api-access-z58ph") pod "85353cc8-0b88-4e2a-8442-6599665e4037" (UID: "85353cc8-0b88-4e2a-8442-6599665e4037"). InnerVolumeSpecName "kube-api-access-z58ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.254369 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") on node \"crc\" DevicePath \"\"" Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerDied","Data":"61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604"} Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374797 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604" Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374863 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.075168 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.085750 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.159087 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" path="/var/lib/kubelet/pods/6d81e072-7a00-4b3c-b823-692d3817a4a6/volumes" Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.385336 4722 generic.go:334] "Generic (PLEG): container finished" podID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df" exitCode=0 Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.385378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerDied","Data":"6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"} Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.386002 4722 scope.go:117] "RemoveContainer" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df" Feb 26 21:02:07 crc kubenswrapper[4722]: I0226 21:02:07.040595 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/gather/0.log" Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.005764 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.006499 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-must-gather-v9pkb/must-gather-cl4sw" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy" containerID="cri-o://cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba" gracePeriod=2 Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.019929 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.497059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log" Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.500843 4722 generic.go:334] "Generic (PLEG): container finished" podID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerID="cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba" exitCode=143 Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.737489 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log" Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.738320 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.906927 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.907017 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.927663 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p" (OuterVolumeSpecName: "kube-api-access-s4m9p") pod "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" (UID: "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef"). InnerVolumeSpecName "kube-api-access-s4m9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.011919 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") on node \"crc\" DevicePath \"\"" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.133100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" (UID: "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.216590 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.512253 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.513518 4722 scope.go:117] "RemoveContainer" containerID="cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.513533 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.549313 4722 scope.go:117] "RemoveContainer" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df" Feb 26 21:02:18 crc kubenswrapper[4722]: I0226 21:02:18.163516 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" path="/var/lib/kubelet/pods/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/volumes" Feb 26 21:02:23 crc kubenswrapper[4722]: I0226 21:02:23.487317 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:02:23 crc kubenswrapper[4722]: I0226 21:02:23.487910 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:02:29 crc kubenswrapper[4722]: I0226 21:02:29.873034 4722 scope.go:117] "RemoveContainer" containerID="bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f" Feb 26 21:02:29 crc kubenswrapper[4722]: I0226 21:02:29.929420 4722 scope.go:117] "RemoveContainer" containerID="832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02" Feb 26 21:02:53 crc kubenswrapper[4722]: I0226 21:02:53.487685 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:02:53 crc kubenswrapper[4722]: I0226 21:02:53.488546 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.487269 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.487691 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:03:23 crc 
kubenswrapper[4722]: I0226 21:03:23.487730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.488489 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.488531 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61" gracePeriod=600 Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.415482 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61" exitCode=0 Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.416114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"} Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.416171 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"} Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 
21:03:24.416193 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.571040 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572098 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" containerName="oc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572116 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" containerName="oc" Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572126 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572148 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather" Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572156 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572162 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572410 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572438 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572457 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" 
containerName="oc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.574181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.592284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " 
pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.767093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.767152 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.798204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " 
pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.915359 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:31 crc kubenswrapper[4722]: I0226 21:03:31.539756 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494093 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd" exitCode=0 Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd"} Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"ea26475a83c32d2d3ace26c5048229a128439aad73761cc2dcdb96ed9be127eb"} Feb 26 21:03:33 crc kubenswrapper[4722]: I0226 21:03:33.506491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247"} Feb 26 21:03:35 crc kubenswrapper[4722]: I0226 21:03:35.542727 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247" exitCode=0 Feb 26 21:03:35 crc kubenswrapper[4722]: I0226 21:03:35.543033 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247"} Feb 26 21:03:36 crc kubenswrapper[4722]: I0226 21:03:36.555216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f"} Feb 26 21:03:36 crc kubenswrapper[4722]: I0226 21:03:36.579053 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jd4qc" podStartSLOduration=3.114318527 podStartE2EDuration="6.579034945s" podCreationTimestamp="2026-02-26 21:03:30 +0000 UTC" firstStartedPulling="2026-02-26 21:03:32.496705221 +0000 UTC m=+4155.033673145" lastFinishedPulling="2026-02-26 21:03:35.961421639 +0000 UTC m=+4158.498389563" observedRunningTime="2026-02-26 21:03:36.570541895 +0000 UTC m=+4159.107509819" watchObservedRunningTime="2026-02-26 21:03:36.579034945 +0000 UTC m=+4159.116002869" Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.916756 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.917312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.977299 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:41 crc kubenswrapper[4722]: I0226 21:03:41.659151 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:41 crc kubenswrapper[4722]: I0226 
21:03:41.731011 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:43 crc kubenswrapper[4722]: I0226 21:03:43.615126 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jd4qc" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server" containerID="cri-o://691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f" gracePeriod=2 Feb 26 21:03:43 crc kubenswrapper[4722]: E0226 21:03:43.747246 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc03b4fa_b901_494b_8384_3cd16e437bc3.slice/crio-691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f.scope\": RecentStats: unable to find data in memory cache]" Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.625913 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f" exitCode=0 Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.625978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f"} Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.822260 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982751 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982809 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.983809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities" (OuterVolumeSpecName: "utilities") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.998557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5" (OuterVolumeSpecName: "kube-api-access-g8wg5") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "kube-api-access-g8wg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.035991 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085493 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") on node \"crc\" DevicePath \"\"" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085748 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085850 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"ea26475a83c32d2d3ace26c5048229a128439aad73761cc2dcdb96ed9be127eb"} Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637192 4722 scope.go:117] "RemoveContainer" containerID="691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637258 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.655501 4722 scope.go:117] "RemoveContainer" containerID="9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.672984 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.679641 4722 scope.go:117] "RemoveContainer" containerID="df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd" Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.682082 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"] Feb 26 21:03:46 crc kubenswrapper[4722]: I0226 21:03:46.158184 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" path="/var/lib/kubelet/pods/cc03b4fa-b901-494b-8384-3cd16e437bc3/volumes" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.145432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"] Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146302 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-content" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146316 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-content" Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146338 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-utilities" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" 
containerName="extract-utilities" Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146354 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146360 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146563 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.147459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.152089 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.153015 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.158455 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.161646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"] Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.205739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:00 crc 
kubenswrapper[4722]: I0226 21:04:00.307494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.326002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.473210 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.945037 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"] Feb 26 21:04:01 crc kubenswrapper[4722]: I0226 21:04:01.816690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerStarted","Data":"2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15"} Feb 26 21:04:02 crc kubenswrapper[4722]: I0226 21:04:02.840004 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerID="266f5c20f9eb286aa461bb2f7339106c56b0ab99df1ee88ec448239c726831b2" exitCode=0 Feb 26 21:04:02 crc kubenswrapper[4722]: I0226 21:04:02.840122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" 
event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerDied","Data":"266f5c20f9eb286aa461bb2f7339106c56b0ab99df1ee88ec448239c726831b2"} Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.376356 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.489395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.501566 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb" (OuterVolumeSpecName: "kube-api-access-9j4fb") pod "f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" (UID: "f6e99d3c-7e5b-4dee-ae88-b886e323ff9f"). InnerVolumeSpecName "kube-api-access-9j4fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.591331 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerDied","Data":"2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15"} Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858663 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858662 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:05 crc kubenswrapper[4722]: I0226 21:04:05.464366 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 21:04:05 crc kubenswrapper[4722]: I0226 21:04:05.478959 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 21:04:06 crc kubenswrapper[4722]: I0226 21:04:06.160362 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" path="/var/lib/kubelet/pods/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923/volumes" Feb 26 21:04:30 crc kubenswrapper[4722]: I0226 21:04:30.078473 4722 scope.go:117] "RemoveContainer" containerID="32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.399422 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:32 crc kubenswrapper[4722]: E0226 21:04:32.400424 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.400441 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.400683 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.402296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.413145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sf6f\" 
(UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.605511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.605513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.625031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.724712 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:33 crc kubenswrapper[4722]: I0226 21:04:33.172496 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.376242 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" exitCode=0 Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.377605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f"} Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.377691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"c5846720cbf8c732dd7749a4373a29ca515241aa71f8529a40b0cdf14ee71030"} Feb 26 21:04:35 crc kubenswrapper[4722]: I0226 21:04:35.389659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} Feb 26 21:04:37 crc kubenswrapper[4722]: I0226 21:04:37.406820 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" exitCode=0 Feb 26 21:04:37 crc kubenswrapper[4722]: I0226 21:04:37.406891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} Feb 26 21:04:38 crc kubenswrapper[4722]: I0226 21:04:38.419567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} Feb 26 21:04:38 crc kubenswrapper[4722]: I0226 21:04:38.443980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j2p6w" podStartSLOduration=2.808838851 podStartE2EDuration="6.443955202s" podCreationTimestamp="2026-02-26 21:04:32 +0000 UTC" firstStartedPulling="2026-02-26 21:04:34.380654064 +0000 UTC m=+4216.917621988" lastFinishedPulling="2026-02-26 21:04:38.015770415 +0000 UTC m=+4220.552738339" observedRunningTime="2026-02-26 21:04:38.435936097 +0000 UTC m=+4220.972904031" watchObservedRunningTime="2026-02-26 21:04:38.443955202 +0000 UTC m=+4220.980923136" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.725648 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.726215 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.768494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:43 crc kubenswrapper[4722]: I0226 21:04:43.523965 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:43 crc kubenswrapper[4722]: I0226 21:04:43.573652 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:45 crc kubenswrapper[4722]: I0226 21:04:45.484217 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j2p6w" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" containerID="cri-o://4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" gracePeriod=2 Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.124644 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208929 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.209682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities" (OuterVolumeSpecName: "utilities") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.215378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f" (OuterVolumeSpecName: "kube-api-access-5sf6f") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "kube-api-access-5sf6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.264270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315907 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315946 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315958 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494716 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" exitCode=0 Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494775 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"c5846720cbf8c732dd7749a4373a29ca515241aa71f8529a40b0cdf14ee71030"} Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494914 4722 scope.go:117] "RemoveContainer" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.514195 4722 scope.go:117] "RemoveContainer" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.530171 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.540402 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.027969 4722 scope.go:117] "RemoveContainer" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.080768 4722 scope.go:117] "RemoveContainer" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.081131 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": container with ID starting with 4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9 not found: ID does not exist" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081248 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} err="failed to get container status \"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": rpc error: code = NotFound desc = could not find container \"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": container with ID starting with 4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9 not found: ID does not exist" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081275 4722 scope.go:117] "RemoveContainer" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.081756 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": container with ID starting with f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1 not found: ID does not exist" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081783 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} err="failed to get container status \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": rpc error: code = NotFound desc = could not find container \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": container with ID 
starting with f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1 not found: ID does not exist" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081798 4722 scope.go:117] "RemoveContainer" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.082075 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": container with ID starting with 2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f not found: ID does not exist" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.082124 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f"} err="failed to get container status \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": rpc error: code = NotFound desc = could not find container \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": container with ID starting with 2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f not found: ID does not exist" Feb 26 21:04:48 crc kubenswrapper[4722]: I0226 21:04:48.160986 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" path="/var/lib/kubelet/pods/6f961df0-1523-4bff-a96c-3869df797d0b/volumes" Feb 26 21:05:53 crc kubenswrapper[4722]: I0226 21:05:53.487808 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:05:53 crc kubenswrapper[4722]: I0226 
21:05:53.488450 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.141682 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142606 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-utilities" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142619 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-utilities" Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142642 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-content" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142648 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-content" Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142664 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142671 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142854 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.143504 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.147396 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.148100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.157962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.199519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.219476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.321449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.602752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: 
\"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.772660 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:01 crc kubenswrapper[4722]: I0226 21:06:01.268413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:01 crc kubenswrapper[4722]: I0226 21:06:01.275342 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 21:06:02 crc kubenswrapper[4722]: I0226 21:06:02.246998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerStarted","Data":"7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2"} Feb 26 21:06:03 crc kubenswrapper[4722]: I0226 21:06:03.256330 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerID="ebbc7bc54500e257a5f035fdf1dd991ef1c2eb9809e34df00f1203200afcf17e" exitCode=0 Feb 26 21:06:03 crc kubenswrapper[4722]: I0226 21:06:03.256431 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerDied","Data":"ebbc7bc54500e257a5f035fdf1dd991ef1c2eb9809e34df00f1203200afcf17e"} Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.790255 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.912332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.924428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc" (OuterVolumeSpecName: "kube-api-access-946nc") pod "b1258405-ec09-4b2b-ad14-5710eb5ea82e" (UID: "b1258405-ec09-4b2b-ad14-5710eb5ea82e"). InnerVolumeSpecName "kube-api-access-946nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.031814 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") on node \"crc\" DevicePath \"\"" Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerDied","Data":"7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2"} Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276297 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2" Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276294 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.873243 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"] Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.885776 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"] Feb 26 21:06:06 crc kubenswrapper[4722]: I0226 21:06:06.164839 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" path="/var/lib/kubelet/pods/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2/volumes" Feb 26 21:06:23 crc kubenswrapper[4722]: I0226 21:06:23.487079 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:06:23 crc kubenswrapper[4722]: I0226 21:06:23.488061 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:06:30 crc kubenswrapper[4722]: I0226 21:06:30.230768 4722 scope.go:117] "RemoveContainer" containerID="86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.036752 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:06:52 crc kubenswrapper[4722]: E0226 21:06:52.037729 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc" Feb 26 21:06:52 crc 
kubenswrapper[4722]: I0226 21:06:52.037743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.037931 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.039541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.057174 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166193 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: 
I0226 21:06:52.268059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.268107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.268210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.269015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.269237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.604577 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.655626 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.176528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487561 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487909 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487954 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.488724 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.488779 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" gracePeriod=600 Feb 26 21:06:53 crc kubenswrapper[4722]: E0226 21:06:53.619995 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.768969 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f" exitCode=0 Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.769071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"} Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.769122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"f02c0f2b938c77627bc862b126dc2a802e34f2da59b5b54764916bf6c637e9c0"} Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774401 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" exitCode=0 Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"} Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774539 4722 scope.go:117] "RemoveContainer" containerID="16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61" Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.775201 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:06:53 crc kubenswrapper[4722]: E0226 21:06:53.775466 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:06:54 crc kubenswrapper[4722]: I0226 21:06:54.788250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"} Feb 26 21:06:55 crc kubenswrapper[4722]: I0226 21:06:55.799327 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193" exitCode=0 Feb 26 21:06:55 crc kubenswrapper[4722]: I0226 21:06:55.799424 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"} Feb 26 21:06:56 crc kubenswrapper[4722]: I0226 21:06:56.810049 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"} Feb 26 21:06:56 crc kubenswrapper[4722]: I0226 21:06:56.834854 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gh6cx" podStartSLOduration=2.418289529 podStartE2EDuration="4.834833938s" podCreationTimestamp="2026-02-26 21:06:52 +0000 UTC" firstStartedPulling="2026-02-26 21:06:53.770502608 +0000 UTC m=+4356.307470532" lastFinishedPulling="2026-02-26 21:06:56.187047017 +0000 UTC m=+4358.724014941" observedRunningTime="2026-02-26 21:06:56.829502224 +0000 UTC m=+4359.366470168" watchObservedRunningTime="2026-02-26 21:06:56.834833938 +0000 UTC m=+4359.371801862" Feb 26 21:07:02 crc kubenswrapper[4722]: I0226 21:07:02.656249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:02 crc kubenswrapper[4722]: I0226 21:07:02.657988 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 21:07:03.249084 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 21:07:03.305369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 
21:07:03.497324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:07:04 crc kubenswrapper[4722]: I0226 21:07:04.896162 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gh6cx" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server" containerID="cri-o://bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" gracePeriod=2 Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.145914 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:07:05 crc kubenswrapper[4722]: E0226 21:07:05.146618 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.512409 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.638806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities" (OuterVolumeSpecName: "utilities") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.642890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc" (OuterVolumeSpecName: "kube-api-access-vxsbc") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "kube-api-access-vxsbc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.672980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738859 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738909 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738926 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") on node \"crc\" DevicePath \"\"" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906681 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" exitCode=0 Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906740 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"} Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"f02c0f2b938c77627bc862b126dc2a802e34f2da59b5b54764916bf6c637e9c0"} Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906797 4722 scope.go:117] "RemoveContainer" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906946 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.932582 4722 scope.go:117] "RemoveContainer" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.954217 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.964604 4722 scope.go:117] "RemoveContainer" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f" Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.965319 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"] Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.014429 4722 scope.go:117] "RemoveContainer" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 21:07:06.014943 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": container with ID starting with bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8 not found: ID does not exist" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.014992 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"} err="failed to get container status \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": rpc error: code = NotFound desc = could not find container \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": container with ID starting with bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8 not found: ID does not exist" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015014 4722 scope.go:117] "RemoveContainer" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193" Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 21:07:06.015413 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": container with ID starting with 672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193 not found: ID does not exist" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015464 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"} err="failed to get container status \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": rpc error: code = NotFound desc = could not find container \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": container with ID starting with 672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193 not found: ID does not exist" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015509 4722 scope.go:117] "RemoveContainer" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f" Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 
21:07:06.015859 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": container with ID starting with 22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f not found: ID does not exist" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015900 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"} err="failed to get container status \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": rpc error: code = NotFound desc = could not find container \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": container with ID starting with 22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f not found: ID does not exist" Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.173457 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" path="/var/lib/kubelet/pods/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45/volumes" Feb 26 21:07:16 crc kubenswrapper[4722]: I0226 21:07:16.146396 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:07:16 crc kubenswrapper[4722]: E0226 21:07:16.147028 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:07:30 crc kubenswrapper[4722]: I0226 21:07:30.146630 
4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:07:30 crc kubenswrapper[4722]: E0226 21:07:30.147469 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:07:42 crc kubenswrapper[4722]: I0226 21:07:42.146865 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:07:42 crc kubenswrapper[4722]: E0226 21:07:42.147724 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:07:57 crc kubenswrapper[4722]: I0226 21:07:57.146023 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:07:57 crc kubenswrapper[4722]: E0226 21:07:57.147043 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 
21:08:00.156973 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"] Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-utilities" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-utilities" Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157649 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server" Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157663 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-content" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157668 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-content" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157847 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.158534 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.163358 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.164167 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.165321 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.174241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"] Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.212075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.314116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.339630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " 
pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.479567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.976769 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"] Feb 26 21:08:01 crc kubenswrapper[4722]: I0226 21:08:01.464572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerStarted","Data":"734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b"} Feb 26 21:08:04 crc kubenswrapper[4722]: I0226 21:08:04.493395 4722 generic.go:334] "Generic (PLEG): container finished" podID="ca8fb463-a781-487f-a648-ab2cf63b5e89" containerID="b788dcb4b0fb367ed2d9735b9baf9ed83497eebd895dcfbcc173f3f02a0273b8" exitCode=0 Feb 26 21:08:04 crc kubenswrapper[4722]: I0226 21:08:04.494521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerDied","Data":"b788dcb4b0fb367ed2d9735b9baf9ed83497eebd895dcfbcc173f3f02a0273b8"} Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.044194 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.129878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"ca8fb463-a781-487f-a648-ab2cf63b5e89\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.152408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w" (OuterVolumeSpecName: "kube-api-access-79r5w") pod "ca8fb463-a781-487f-a648-ab2cf63b5e89" (UID: "ca8fb463-a781-487f-a648-ab2cf63b5e89"). InnerVolumeSpecName "kube-api-access-79r5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.233968 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") on node \"crc\" DevicePath \"\"" Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.516500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerDied","Data":"734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b"} Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.516868 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b" Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.517058 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" Feb 26 21:08:07 crc kubenswrapper[4722]: I0226 21:08:07.114223 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:08:07 crc kubenswrapper[4722]: I0226 21:08:07.123959 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:08:08 crc kubenswrapper[4722]: I0226 21:08:08.167484 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" path="/var/lib/kubelet/pods/85353cc8-0b88-4e2a-8442-6599665e4037/volumes" Feb 26 21:08:12 crc kubenswrapper[4722]: I0226 21:08:12.146532 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" Feb 26 21:08:12 crc kubenswrapper[4722]: E0226 21:08:12.147463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"